Dec 05 08:24:07 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 05 08:24:07 crc restorecon[4589]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 05 08:24:07 crc restorecon[4589]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc 
restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc 
restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 
08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc 
restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc 
restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 
08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc 
restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc 
restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:07 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc 
restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 
crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc 
restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc 
restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc 
restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc 
restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 05 08:24:08 crc restorecon[4589]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 05 08:24:08 crc kubenswrapper[4795]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 08:24:08 crc kubenswrapper[4795]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 05 08:24:08 crc kubenswrapper[4795]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 08:24:08 crc kubenswrapper[4795]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 05 08:24:08 crc kubenswrapper[4795]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 05 08:24:08 crc kubenswrapper[4795]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.565091 4795 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569728 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569755 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569762 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569767 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569773 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569779 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569785 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569792 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569800 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569806 4795 feature_gate.go:330] 
unrecognized feature gate: PlatformOperators Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569813 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569819 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569824 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569829 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569835 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569841 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569846 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569852 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569857 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569862 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569868 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569874 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569880 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569885 4795 feature_gate.go:330] unrecognized feature gate: 
MachineAPIMigration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569891 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569897 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569902 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569907 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569914 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569920 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569925 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569930 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569938 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569945 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569953 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569959 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569965 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569970 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569978 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569983 4795 feature_gate.go:330] unrecognized feature gate: Example Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569990 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.569996 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570001 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570006 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570012 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570017 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570032 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570040 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570045 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 
08:24:08.570051 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570056 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570062 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570067 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570072 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570078 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570083 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570088 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570094 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570099 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570104 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570110 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570117 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570124 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570130 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570137 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570142 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570148 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570157 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570164 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570170 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.570177 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570513 4795 flags.go:64] FLAG: --address="0.0.0.0" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570536 4795 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570548 4795 flags.go:64] FLAG: --anonymous-auth="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570556 4795 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570564 4795 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570570 4795 flags.go:64] FLAG: 
--authentication-token-webhook-cache-ttl="2m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570579 4795 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570586 4795 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570593 4795 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570599 4795 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570606 4795 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570639 4795 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570646 4795 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570652 4795 flags.go:64] FLAG: --cgroup-root="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570658 4795 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570664 4795 flags.go:64] FLAG: --client-ca-file="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570671 4795 flags.go:64] FLAG: --cloud-config="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570677 4795 flags.go:64] FLAG: --cloud-provider="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570683 4795 flags.go:64] FLAG: --cluster-dns="[]" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570690 4795 flags.go:64] FLAG: --cluster-domain="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570696 4795 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570702 4795 flags.go:64] FLAG: --config-dir="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570708 
4795 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570715 4795 flags.go:64] FLAG: --container-log-max-files="5" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570723 4795 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570729 4795 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570735 4795 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570742 4795 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570749 4795 flags.go:64] FLAG: --contention-profiling="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570756 4795 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570762 4795 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570769 4795 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570775 4795 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570783 4795 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570789 4795 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570796 4795 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570802 4795 flags.go:64] FLAG: --enable-load-reader="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570808 4795 flags.go:64] FLAG: --enable-server="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570815 4795 flags.go:64] FLAG: 
--enforce-node-allocatable="[pods]" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570823 4795 flags.go:64] FLAG: --event-burst="100" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570829 4795 flags.go:64] FLAG: --event-qps="50" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570835 4795 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570842 4795 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570848 4795 flags.go:64] FLAG: --eviction-hard="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570856 4795 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570862 4795 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570868 4795 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570875 4795 flags.go:64] FLAG: --eviction-soft="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570881 4795 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570887 4795 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570894 4795 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570900 4795 flags.go:64] FLAG: --experimental-mounter-path="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570906 4795 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570912 4795 flags.go:64] FLAG: --fail-swap-on="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570918 4795 flags.go:64] FLAG: --feature-gates="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570925 4795 flags.go:64] FLAG: 
--file-check-frequency="20s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570932 4795 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570938 4795 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570944 4795 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570951 4795 flags.go:64] FLAG: --healthz-port="10248" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570957 4795 flags.go:64] FLAG: --help="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570963 4795 flags.go:64] FLAG: --hostname-override="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570969 4795 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570976 4795 flags.go:64] FLAG: --http-check-frequency="20s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570982 4795 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570989 4795 flags.go:64] FLAG: --image-credential-provider-config="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.570995 4795 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571001 4795 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571007 4795 flags.go:64] FLAG: --image-service-endpoint="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571013 4795 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571019 4795 flags.go:64] FLAG: --kube-api-burst="100" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571025 4795 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571032 
4795 flags.go:64] FLAG: --kube-api-qps="50" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571038 4795 flags.go:64] FLAG: --kube-reserved="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571044 4795 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571050 4795 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571056 4795 flags.go:64] FLAG: --kubelet-cgroups="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571062 4795 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571068 4795 flags.go:64] FLAG: --lock-file="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571074 4795 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571080 4795 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571087 4795 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571096 4795 flags.go:64] FLAG: --log-json-split-stream="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571102 4795 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571108 4795 flags.go:64] FLAG: --log-text-split-stream="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571114 4795 flags.go:64] FLAG: --logging-format="text" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571120 4795 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571127 4795 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571133 4795 flags.go:64] FLAG: --manifest-url="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571139 4795 
flags.go:64] FLAG: --manifest-url-header="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571147 4795 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571154 4795 flags.go:64] FLAG: --max-open-files="1000000" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571162 4795 flags.go:64] FLAG: --max-pods="110" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571168 4795 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571174 4795 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571182 4795 flags.go:64] FLAG: --memory-manager-policy="None" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571188 4795 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571194 4795 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571200 4795 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571206 4795 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571226 4795 flags.go:64] FLAG: --node-status-max-images="50" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.571300 4795 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573523 4795 flags.go:64] FLAG: --oom-score-adj="-999" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573549 4795 flags.go:64] FLAG: --pod-cidr="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573567 4795 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573594 4795 flags.go:64] FLAG: --pod-manifest-path="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573608 4795 flags.go:64] FLAG: --pod-max-pids="-1" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573666 4795 flags.go:64] FLAG: --pods-per-core="0" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573681 4795 flags.go:64] FLAG: --port="10250" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573695 4795 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573710 4795 flags.go:64] FLAG: --provider-id="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573723 4795 flags.go:64] FLAG: --qos-reserved="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573739 4795 flags.go:64] FLAG: --read-only-port="10255" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573753 4795 flags.go:64] FLAG: --register-node="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573765 4795 flags.go:64] FLAG: --register-schedulable="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573779 4795 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573835 4795 flags.go:64] FLAG: --registry-burst="10" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573848 4795 flags.go:64] FLAG: --registry-qps="5" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573860 4795 flags.go:64] FLAG: --reserved-cpus="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573873 4795 flags.go:64] FLAG: --reserved-memory="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573891 4795 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 
08:24:08.573906 4795 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573920 4795 flags.go:64] FLAG: --rotate-certificates="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573933 4795 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573946 4795 flags.go:64] FLAG: --runonce="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573960 4795 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573975 4795 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.573991 4795 flags.go:64] FLAG: --seccomp-default="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574004 4795 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574017 4795 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574033 4795 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574047 4795 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574061 4795 flags.go:64] FLAG: --storage-driver-password="root" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574073 4795 flags.go:64] FLAG: --storage-driver-secure="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574087 4795 flags.go:64] FLAG: --storage-driver-table="stats" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574101 4795 flags.go:64] FLAG: --storage-driver-user="root" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574115 4795 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574132 4795 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 05 
08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574145 4795 flags.go:64] FLAG: --system-cgroups="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574158 4795 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574184 4795 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574199 4795 flags.go:64] FLAG: --tls-cert-file="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574212 4795 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574233 4795 flags.go:64] FLAG: --tls-min-version="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574246 4795 flags.go:64] FLAG: --tls-private-key-file="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574259 4795 flags.go:64] FLAG: --topology-manager-policy="none" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574273 4795 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574287 4795 flags.go:64] FLAG: --topology-manager-scope="container" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574301 4795 flags.go:64] FLAG: --v="2" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574319 4795 flags.go:64] FLAG: --version="false" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574336 4795 flags.go:64] FLAG: --vmodule="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574354 4795 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.574368 4795 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574765 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574788 4795 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574801 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574817 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574829 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574846 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574861 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574876 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574889 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574901 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574913 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574924 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574935 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574948 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574975 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574987 4795 feature_gate.go:330] unrecognized feature gate: 
PlatformOperators Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.574999 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575010 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575021 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575033 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575045 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575057 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575069 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575081 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575092 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575104 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575116 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575128 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575140 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575151 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 
08:24:08.575163 4795 feature_gate.go:330] unrecognized feature gate: Example Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575174 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575185 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575197 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575208 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575221 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575232 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575244 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575255 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575268 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575280 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575291 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575303 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575315 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575328 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 08:24:08 crc 
kubenswrapper[4795]: W1205 08:24:08.575384 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575405 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575420 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575436 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575449 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575464 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575476 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575488 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575499 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575512 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575524 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575536 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575547 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575558 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 08:24:08 crc 
kubenswrapper[4795]: W1205 08:24:08.575570 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575581 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575594 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575606 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575670 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575686 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575699 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575718 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575730 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575741 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575757 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.575770 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.576140 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.590557 4795 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.590687 4795 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590849 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590873 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590890 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590912 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590929 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590943 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590953 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590964 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590974 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590984 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.590993 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591002 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591011 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591023 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591035 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591045 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591055 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591064 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591073 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591085 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591096 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591107 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591116 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591126 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591136 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591148 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591159 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591171 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 
08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591182 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591193 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591204 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591216 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591226 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591235 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591250 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591262 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591274 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591286 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591297 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591310 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591321 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591333 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591345 4795 feature_gate.go:330] unrecognized 
feature gate: EtcdBackendQuota Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591357 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591368 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591380 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591392 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591403 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591415 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591427 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591438 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591488 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591498 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591509 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591521 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591532 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591543 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591556 
4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591568 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591579 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591590 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591602 4795 feature_gate.go:330] unrecognized feature gate: Example Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591613 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591691 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591704 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591716 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591727 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591739 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591752 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591763 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.591777 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.591797 4795 feature_gate.go:386] feature gates: 
{map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592131 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592156 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592168 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592180 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592191 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592203 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592214 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592223 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592232 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592241 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592251 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 
08:24:08.592260 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592269 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592277 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592286 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592295 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592304 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592313 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592322 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592334 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592349 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592362 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592377 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592390 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592402 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592414 4795 feature_gate.go:330] unrecognized feature gate: Example Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592427 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592439 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592450 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592461 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592472 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592488 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592502 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592515 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592529 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592542 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592553 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592564 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592575 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592587 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592598 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592608 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592660 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592672 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592685 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592697 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 05 08:24:08 crc 
kubenswrapper[4795]: W1205 08:24:08.592709 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592720 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592731 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592742 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592753 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592765 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592777 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592788 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592802 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592814 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592825 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592837 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592848 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592859 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 
08:24:08.592870 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592882 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592894 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592905 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592916 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592927 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592939 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592956 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592971 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592983 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.592996 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.593011 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.593814 4795 server.go:940] "Client rotation is on, will bootstrap in background" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.600227 4795 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.600468 4795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.601895 4795 server.go:997] "Starting client certificate rotation" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.601932 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.602119 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-18 22:14:03.871025629 +0000 UTC Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.602207 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.609647 4795 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.610924 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.612467 4795 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.620479 4795 log.go:25] "Validated CRI v1 runtime API" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.637607 4795 log.go:25] "Validated CRI v1 image API" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.640400 4795 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.643266 4795 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-05-08-18-42-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.643587 4795 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.666151 4795 manager.go:217] Machine: {Timestamp:2025-12-05 08:24:08.66479495 +0000 UTC m=+0.237398739 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199484928 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:57d745a0-49a1-4146-a982-16c31b0a2ce8 BootID:0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599742464 Type:vfs Inodes:3076109 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 
Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076109 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:28:ae:c2 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:28:ae:c2 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:91:3c:58 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:bf:c6:86 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:1e:43:7b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:9e:20:45 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:56:b4:5b:a0:2d:d1 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:aa:a9:be:5c:ce:8e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199484928 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction 
Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.666451 4795 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.666705 4795 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.667200 4795 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.667409 4795 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.667445 4795 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.667716 4795 topology_manager.go:138] "Creating topology manager with none policy"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.667736 4795 container_manager_linux.go:303] "Creating device plugin manager"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.668010 4795 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.668052 4795 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.668399 4795 state_mem.go:36] "Initialized new in-memory state store"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.668523 4795 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.669457 4795 kubelet.go:418] "Attempting to sync node with API server"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.669523 4795 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.669561 4795 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.669599 4795 kubelet.go:324] "Adding apiserver pod source"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.669794 4795 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.671872 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.671871 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.672218 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.672332 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.672495 4795 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.673352 4795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.674406 4795 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675066 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675099 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675110 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675120 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675134 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675143 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675151 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675166 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675176 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675185 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675197 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675208 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675405 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.675966 4795 server.go:1280] "Started kubelet"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.676109 4795 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.676488 4795 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.677998 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 05 08:24:08 crc systemd[1]: Started Kubernetes Kubelet.
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.678547 4795 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.679694 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.679734 4795 server.go:460] "Adding debug handlers to kubelet server"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.679752 4795 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.681409 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.681510 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 16:24:46.649243825 +0000 UTC
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.681567 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 536h0m37.967681499s for next certificate rotation
Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.682297 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="200ms"
Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.680110 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e44296c3a74d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 08:24:08.67593135 +0000 UTC m=+0.248535099,LastTimestamp:2025-12-05 08:24:08.67593135 +0000 UTC m=+0.248535099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.683430 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.683556 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.683908 4795 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.683948 4795 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.684220 4795 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.684510 4795 factory.go:55] Registering systemd factory
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.684538 4795 factory.go:221] Registration of the systemd container factory successfully
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.684893 4795 factory.go:153] Registering CRI-O factory
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.684917 4795 factory.go:221] Registration of the crio container factory successfully
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.685009 4795 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.685041 4795 factory.go:103] Registering Raw factory
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.685062 4795 manager.go:1196] Started watching for new ooms in manager
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.686028 4795 manager.go:319] Starting recovery of all containers
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.699374 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.699980 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.700008 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.700024 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.700037 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.700049 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.700063 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.700073 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.701824 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.701849 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.701861 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.701870 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.701880 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.701891 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.702284 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.702305 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.702317 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.702327 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.702337 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.702346 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703782 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703794 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703813 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703822 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703832 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703842 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703884 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703896 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703905 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703914 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703924 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703932 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703942 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703951 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703960 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703969 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703978 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703987 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.703996 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704004 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704014 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704024 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704033 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704043 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704052 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704061 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704070 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704079 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704090 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704099 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704108 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704117 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704129 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704139 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704148 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704159 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704169 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704201 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704211 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704243 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704253 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704262 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704272 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704282 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704294 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704306 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704318 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704330 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704341 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704379 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704391 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704400 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704409 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704418 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704427 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704436 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704445 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704453 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704463 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704472 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704486 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704496 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704505 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704515 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704525 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704572 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704597 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704608 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc"
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704638 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704651 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704662 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704674 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704688 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704700 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704710 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704721 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704733 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704744 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704755 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704766 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704778 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704789 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704800 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704812 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704831 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704844 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704856 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704869 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704883 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704898 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704910 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704925 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704939 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704952 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704963 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704974 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704986 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.704997 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705008 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705022 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705033 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705044 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705055 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705066 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705077 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705089 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705100 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705112 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705125 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705139 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705150 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705161 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705172 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705184 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705195 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705208 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705224 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705236 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705249 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705260 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705321 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705333 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705347 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705360 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705380 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705394 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705405 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705417 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705431 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705444 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705456 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705469 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705482 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705495 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" 
seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705507 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705519 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705531 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705544 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705557 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.705570 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 
08:24:08.709150 4795 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709183 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709195 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709206 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709216 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709226 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 05 08:24:08 crc 
kubenswrapper[4795]: I1205 08:24:08.709235 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709245 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709255 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709264 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709273 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709283 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709292 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709301 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709309 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709320 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709329 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709339 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709348 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709357 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709366 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709375 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709383 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709392 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709400 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709410 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709418 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709426 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709435 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709445 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709453 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709463 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709474 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709482 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709491 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709499 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709508 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709516 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709535 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709549 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709561 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709574 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709584 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709593 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709603 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709616 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709665 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709676 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709686 4795 reconstruct.go:97] "Volume reconstruction finished" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.709693 4795 reconciler.go:26] "Reconciler: start to sync state" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 
08:24:08.711466 4795 manager.go:324] Recovery completed Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.721279 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.722887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.722942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.722955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.723895 4795 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.723918 4795 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.723942 4795 state_mem.go:36] "Initialized new in-memory state store" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.731870 4795 policy_none.go:49] "None policy: Start" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.734482 4795 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.734661 4795 state_mem.go:35] "Initializing new in-memory state store" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.744262 4795 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.745842 4795 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.746010 4795 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.746047 4795 kubelet.go:2335] "Starting kubelet main sync loop" Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.746115 4795 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 05 08:24:08 crc kubenswrapper[4795]: W1205 08:24:08.747384 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.747472 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.782178 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.793489 4795 manager.go:334] "Starting Device Plugin manager" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.794106 4795 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.794124 4795 server.go:79] "Starting device plugin registration server" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.794584 4795 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 05 08:24:08 crc 
kubenswrapper[4795]: I1205 08:24:08.794599 4795 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.794857 4795 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.794930 4795 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.794941 4795 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.802030 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.846540 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.846681 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848080 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848176 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848441 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848551 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848930 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.848948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.849277 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.849422 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.849476 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.849881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.849905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.849914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.850987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.851012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.851021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.851050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.851081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.851094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.851257 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.851405 
4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.851466 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.852472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.852527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.852542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.852978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.853028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.853047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.853208 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.853321 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.853356 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.854131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.854137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.854200 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.854210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.854172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.854238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.854511 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.854562 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.855717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.855752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.855764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.883672 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="400ms" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.894821 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.896594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.896651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.896664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.896690 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 08:24:08 crc kubenswrapper[4795]: E1205 08:24:08.897071 4795 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912697 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912733 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912793 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912854 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912882 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.912945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.913010 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.913088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.913130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 08:24:08 crc kubenswrapper[4795]: I1205 08:24:08.913179 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.014892 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.014962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.014999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015050 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015087 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015156 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015193 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015329 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015399 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015437 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015469 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015555 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 
08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015640 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.015731 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.097390 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.099246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.099275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.099284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.099304 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 08:24:09 crc kubenswrapper[4795]: E1205 08:24:09.099724 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.180895 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.191876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.198867 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: W1205 08:24:09.214542 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e5a7c6fb4fe0a3da9d4c76c3acb6b2260f46d413ab84fd7e8db232e00bc1b433 WatchSource:0}: Error finding container e5a7c6fb4fe0a3da9d4c76c3acb6b2260f46d413ab84fd7e8db232e00bc1b433: Status 404 returned error can't find the container with id e5a7c6fb4fe0a3da9d4c76c3acb6b2260f46d413ab84fd7e8db232e00bc1b433 Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.217967 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: W1205 08:24:09.218422 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-ebc12bba5aef1f894deddc606610a28a6c6cc77d6f2917fbc969f8662cdef254 WatchSource:0}: Error finding container ebc12bba5aef1f894deddc606610a28a6c6cc77d6f2917fbc969f8662cdef254: Status 404 returned error can't find the container with id ebc12bba5aef1f894deddc606610a28a6c6cc77d6f2917fbc969f8662cdef254 Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.223198 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:09 crc kubenswrapper[4795]: W1205 08:24:09.223296 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ed8e2b70837922e729192321be74e88313b46aac0bf2fcf6aee4cd403a376cf5 WatchSource:0}: Error finding container ed8e2b70837922e729192321be74e88313b46aac0bf2fcf6aee4cd403a376cf5: Status 404 returned error can't find the container with id ed8e2b70837922e729192321be74e88313b46aac0bf2fcf6aee4cd403a376cf5 Dec 05 08:24:09 crc kubenswrapper[4795]: E1205 08:24:09.285047 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="800ms" Dec 05 08:24:09 crc kubenswrapper[4795]: W1205 08:24:09.285816 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-2fac0e38b17bb2e8e026aeaa46aa4e1110b5ce336456279a8af7490a31650adc 
WatchSource:0}: Error finding container 2fac0e38b17bb2e8e026aeaa46aa4e1110b5ce336456279a8af7490a31650adc: Status 404 returned error can't find the container with id 2fac0e38b17bb2e8e026aeaa46aa4e1110b5ce336456279a8af7490a31650adc Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.500511 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.502536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.502591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.502601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.502672 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 08:24:09 crc kubenswrapper[4795]: E1205 08:24:09.503185 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.680105 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.752805 4795 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9" exitCode=0 Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.752889 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.753022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2fac0e38b17bb2e8e026aeaa46aa4e1110b5ce336456279a8af7490a31650adc"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.753128 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.754305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.754343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.754360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.754740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.754771 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0d173b049f58ae66a5a9f49ddbda87aeea524f0ee0e056a18cd69609abf7ce7c"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.756371 4795 generic.go:334] "Generic (PLEG): container 
finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44" exitCode=0 Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.756433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.756454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ed8e2b70837922e729192321be74e88313b46aac0bf2fcf6aee4cd403a376cf5"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.756533 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.757224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.757248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.757261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.758414 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc" exitCode=0 Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.758473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc"} Dec 
05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.758494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ebc12bba5aef1f894deddc606610a28a6c6cc77d6f2917fbc969f8662cdef254"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.758579 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.759138 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.759228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.759257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.759268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.760944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.760981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.760996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.763815 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0e0ee85c9c7e5a7af118203c736191d50a2cc9281d1b55c07f18388d1d67d87e" exitCode=0 Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.763881 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0e0ee85c9c7e5a7af118203c736191d50a2cc9281d1b55c07f18388d1d67d87e"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.763925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e5a7c6fb4fe0a3da9d4c76c3acb6b2260f46d413ab84fd7e8db232e00bc1b433"} Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.764008 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:09 crc kubenswrapper[4795]: W1205 08:24:09.764390 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Dec 05 08:24:09 crc kubenswrapper[4795]: E1205 08:24:09.764448 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.764758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.764793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:09 crc kubenswrapper[4795]: I1205 08:24:09.764814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:10 crc 
kubenswrapper[4795]: W1205 08:24:10.076230 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Dec 05 08:24:10 crc kubenswrapper[4795]: E1205 08:24:10.076316 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Dec 05 08:24:10 crc kubenswrapper[4795]: E1205 08:24:10.086488 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="1.6s" Dec 05 08:24:10 crc kubenswrapper[4795]: W1205 08:24:10.115244 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Dec 05 08:24:10 crc kubenswrapper[4795]: E1205 08:24:10.115308 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Dec 05 08:24:10 crc kubenswrapper[4795]: W1205 08:24:10.198782 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Dec 05 08:24:10 crc kubenswrapper[4795]: E1205 08:24:10.198864 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.304183 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.314665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.314714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.314723 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.314745 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 08:24:10 crc kubenswrapper[4795]: E1205 08:24:10.315265 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.697373 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.773940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"11d4f0e50aeceaac1a7969859ab1bdbccf69bcd6f0f518ebfc0e9f172ca1b54e"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.774121 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.775119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.775152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.775165 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.781433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.781503 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.781518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.781697 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:10 crc 
kubenswrapper[4795]: I1205 08:24:10.782689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.782720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.782732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.785810 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.785840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.785854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.785945 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.786739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.786763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.786773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.789457 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.789484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.789498 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.789512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.791406 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3" exitCode=0 Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.791444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3"} Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.791656 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.792742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.792855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:10 crc kubenswrapper[4795]: I1205 08:24:10.792990 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.801868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827"} Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.802094 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.803810 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.803878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.803908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.805497 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505" exitCode=0 Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.805590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505"} Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.805667 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.805767 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.806970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.807027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.807047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.806998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.807158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.807187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.915831 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.917530 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.917591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.917603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:11 crc kubenswrapper[4795]: I1205 08:24:11.917663 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.816539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777"} Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.816600 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c"} Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.816684 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.816691 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee"} Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.816837 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.816870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5"} Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.818322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.818897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:12 crc kubenswrapper[4795]: I1205 08:24:12.818910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.826787 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476"} Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.826828 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.826855 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.828279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.828351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.828376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.828287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:13 crc 
kubenswrapper[4795]: I1205 08:24:13.828463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.828491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:13 crc kubenswrapper[4795]: I1205 08:24:13.958059 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.427796 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.428042 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.429557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.429680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.429711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.829307 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.829362 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.830766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.830808 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.830819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.831092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.831154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:14 crc kubenswrapper[4795]: I1205 08:24:14.831178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.795755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.832766 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.833877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.833930 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.833950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.867476 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.867734 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:15 crc 
kubenswrapper[4795]: I1205 08:24:15.869309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.869382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:15 crc kubenswrapper[4795]: I1205 08:24:15.869403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:16 crc kubenswrapper[4795]: I1205 08:24:16.163877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 05 08:24:16 crc kubenswrapper[4795]: I1205 08:24:16.164151 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:16 crc kubenswrapper[4795]: I1205 08:24:16.166277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:16 crc kubenswrapper[4795]: I1205 08:24:16.166346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:16 crc kubenswrapper[4795]: I1205 08:24:16.166365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.388358 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.388605 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.390011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.390075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.390096 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.727146 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.727545 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.730658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.730688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.730698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.738081 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.839658 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.841004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.841075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:17 crc kubenswrapper[4795]: I1205 08:24:17.841095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:18 crc 
kubenswrapper[4795]: I1205 08:24:18.535101 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:18 crc kubenswrapper[4795]: E1205 08:24:18.802230 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 05 08:24:18 crc kubenswrapper[4795]: I1205 08:24:18.842881 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:18 crc kubenswrapper[4795]: I1205 08:24:18.844114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:18 crc kubenswrapper[4795]: I1205 08:24:18.844159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:18 crc kubenswrapper[4795]: I1205 08:24:18.844170 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:19 crc kubenswrapper[4795]: I1205 08:24:19.031256 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:19 crc kubenswrapper[4795]: I1205 08:24:19.847129 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:19 crc kubenswrapper[4795]: I1205 08:24:19.848581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:19 crc kubenswrapper[4795]: I1205 08:24:19.848689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:19 crc kubenswrapper[4795]: I1205 08:24:19.848712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:19 crc kubenswrapper[4795]: I1205 
08:24:19.853966 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:20 crc kubenswrapper[4795]: E1205 08:24:20.656829 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187e44296c3a74d6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 08:24:08.67593135 +0000 UTC m=+0.248535099,LastTimestamp:2025-12-05 08:24:08.67593135 +0000 UTC m=+0.248535099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 08:24:20 crc kubenswrapper[4795]: I1205 08:24:20.679609 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 05 08:24:20 crc kubenswrapper[4795]: E1205 08:24:20.698772 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 08:24:20 crc kubenswrapper[4795]: I1205 08:24:20.850068 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:20 crc kubenswrapper[4795]: I1205 08:24:20.851318 4795 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:20 crc kubenswrapper[4795]: I1205 08:24:20.851384 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:20 crc kubenswrapper[4795]: I1205 08:24:20.851402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:21 crc kubenswrapper[4795]: I1205 08:24:21.424850 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 08:24:21 crc kubenswrapper[4795]: I1205 08:24:21.424934 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 08:24:21 crc kubenswrapper[4795]: I1205 08:24:21.535478 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 08:24:21 crc kubenswrapper[4795]: I1205 08:24:21.535566 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:24:21 crc kubenswrapper[4795]: 
E1205 08:24:21.688700 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 05 08:24:21 crc kubenswrapper[4795]: W1205 08:24:21.889408 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 05 08:24:21 crc kubenswrapper[4795]: I1205 08:24:21.889513 4795 trace.go:236] Trace[529877424]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 08:24:11.887) (total time: 10001ms): Dec 05 08:24:21 crc kubenswrapper[4795]: Trace[529877424]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:24:21.889) Dec 05 08:24:21 crc kubenswrapper[4795]: Trace[529877424]: [10.001729929s] [10.001729929s] END Dec 05 08:24:21 crc kubenswrapper[4795]: E1205 08:24:21.889544 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 05 08:24:21 crc kubenswrapper[4795]: E1205 08:24:21.919108 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 05 08:24:22 crc kubenswrapper[4795]: I1205 08:24:22.021729 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Dec 05 08:24:22 crc kubenswrapper[4795]: I1205 08:24:22.021812 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 08:24:22 crc kubenswrapper[4795]: I1205 08:24:22.027063 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Dec 05 08:24:22 crc kubenswrapper[4795]: I1205 08:24:22.027117 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 05 08:24:23 crc kubenswrapper[4795]: I1205 08:24:23.967787 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:23 crc 
kubenswrapper[4795]: I1205 08:24:23.968070 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:23 crc kubenswrapper[4795]: I1205 08:24:23.970093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:23 crc kubenswrapper[4795]: I1205 08:24:23.970153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:23 crc kubenswrapper[4795]: I1205 08:24:23.970180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:23 crc kubenswrapper[4795]: I1205 08:24:23.978647 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:24 crc kubenswrapper[4795]: I1205 08:24:24.860711 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:24 crc kubenswrapper[4795]: I1205 08:24:24.861946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:24 crc kubenswrapper[4795]: I1205 08:24:24.862009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:24 crc kubenswrapper[4795]: I1205 08:24:24.862028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:25 crc kubenswrapper[4795]: I1205 08:24:25.040674 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 05 08:24:25 crc kubenswrapper[4795]: I1205 08:24:25.057042 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 05 08:24:25 crc kubenswrapper[4795]: I1205 08:24:25.119576 4795 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Dec 05 08:24:25 crc kubenswrapper[4795]: I1205 08:24:25.121239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:25 crc kubenswrapper[4795]: I1205 08:24:25.121291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:25 crc kubenswrapper[4795]: I1205 08:24:25.121308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:25 crc kubenswrapper[4795]: I1205 08:24:25.121338 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 08:24:25 crc kubenswrapper[4795]: E1205 08:24:25.126380 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 05 08:24:26 crc kubenswrapper[4795]: I1205 08:24:26.989383 4795 trace.go:236] Trace[876802403]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 08:24:12.076) (total time: 14912ms): Dec 05 08:24:26 crc kubenswrapper[4795]: Trace[876802403]: ---"Objects listed" error: 14912ms (08:24:26.989) Dec 05 08:24:26 crc kubenswrapper[4795]: Trace[876802403]: [14.91259101s] [14.91259101s] END Dec 05 08:24:26 crc kubenswrapper[4795]: I1205 08:24:26.989415 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 08:24:26 crc kubenswrapper[4795]: I1205 08:24:26.989651 4795 trace.go:236] Trace[1364193483]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 08:24:12.651) (total time: 14338ms): Dec 05 08:24:26 crc kubenswrapper[4795]: Trace[1364193483]: ---"Objects listed" error: 14338ms (08:24:26.989) Dec 05 08:24:26 crc kubenswrapper[4795]: Trace[1364193483]: [14.338456541s] [14.338456541s] END Dec 05 
08:24:26 crc kubenswrapper[4795]: I1205 08:24:26.989661 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 08:24:26 crc kubenswrapper[4795]: I1205 08:24:26.991725 4795 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 05 08:24:26 crc kubenswrapper[4795]: I1205 08:24:26.991860 4795 trace.go:236] Trace[1981297743]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (05-Dec-2025 08:24:12.295) (total time: 14696ms): Dec 05 08:24:26 crc kubenswrapper[4795]: Trace[1981297743]: ---"Objects listed" error: 14696ms (08:24:26.991) Dec 05 08:24:26 crc kubenswrapper[4795]: Trace[1981297743]: [14.69677251s] [14.69677251s] END Dec 05 08:24:26 crc kubenswrapper[4795]: I1205 08:24:26.991869 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.403671 4795 csr.go:261] certificate signing request csr-qrrs2 is approved, waiting to be issued Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.429809 4795 csr.go:257] certificate signing request csr-qrrs2 is issued Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.446601 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.450241 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56276->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.450317 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" 
output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:56276->192.168.126.11:17697: read: connection reset by peer" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.450662 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.450793 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.482853 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.681910 4795 apiserver.go:52] "Watching apiserver" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.688593 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.689407 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.689842 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.689987 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.690087 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.690176 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.690172 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.690106 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.690093 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.690231 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.690335 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.698804 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.703293 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.705060 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.705256 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.706111 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.706153 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.706246 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.706266 4795 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.706496 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.715279 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.742778 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.766751 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.778705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.785191 4795 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.794169 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.796838 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.796931 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.796965 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.796993 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797020 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797045 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797043 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") 
pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797068 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797150 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797178 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797222 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " 
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797245 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797318 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797336 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797373 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797391 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797430 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797595 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797638 
4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797661 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797686 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797729 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797755 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797814 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797866 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797886 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797918 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.797961 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798064 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798081 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798102 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798130 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798196 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798215 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798233 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798380 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798407 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798548 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798552 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798623 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798740 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798781 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798815 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " 
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798982 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799001 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799060 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799140 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799157 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799212 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799249 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798553 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798568 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798603 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798826 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.798974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799003 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799153 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799211 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799255 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799369 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800365 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799454 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799537 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799595 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799701 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799790 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799801 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799791 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799879 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.799271 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800529 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800597 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802044 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802062 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802112 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802130 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802150 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802201 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802221 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802237 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802271 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802289 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802357 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802376 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802392 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802449 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802493 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802530 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802550 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802826 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802856 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802875 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802920 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802956 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803003 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803021 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 
08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803044 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803094 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803166 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803290 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803361 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803421 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803440 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 08:24:27 crc 
kubenswrapper[4795]: I1205 08:24:27.805762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.805846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.805885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.805915 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.805943 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.805971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806009 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806038 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806064 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806088 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806117 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806158 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806186 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806212 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806246 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806296 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806317 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806395 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806420 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 05 
08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806441 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806490 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806518 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806544 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.806569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.807859 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.807897 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.807926 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.807951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.807979 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " 
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808005 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808090 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808116 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808144 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808171 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808197 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808224 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808253 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808279 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 05 
08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808306 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808331 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808359 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809105 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809191 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 
08:24:27.809265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809292 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809319 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809346 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809373 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809398 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809426 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809452 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809476 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809525 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809553 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809601 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809676 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809701 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809823 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809926 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809949 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810018 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810079 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810109 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810157 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810204 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810230 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810255 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810332 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810467 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810486 4795 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810501 4795 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810516 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810532 4795 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810548 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810564 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810578 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810595 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810629 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810647 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810661 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810677 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810690 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810704 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810719 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810733 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810747 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810761 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" 
DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810779 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810794 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810813 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810836 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810851 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810865 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810879 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc 
kubenswrapper[4795]: I1205 08:24:27.810892 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.810907 4795 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.828821 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800203 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.800937 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.801719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802192 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.802854 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.803313 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.804317 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808864 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808867 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.808955 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809540 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.809691 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.812146 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.813277 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.813506 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.814060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.814364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.815236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.816590 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.816817 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.817218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.817456 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.817556 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.818126 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.818278 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.819119 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.819378 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.819578 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.819824 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.820119 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.817450 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.820204 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.820484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.820782 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.820856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.821113 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.821202 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.821300 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.821387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.821576 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.821749 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.821980 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.822193 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.822224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.822458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.822561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.822696 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.822760 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.831873 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.839877 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.841978 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.842520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.842580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.842638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.842842 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.843111 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.843132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.843186 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.843447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.843668 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.843904 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.843899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.844128 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.844758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.844961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.845299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.846136 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.846710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.848524 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.848664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.848750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.848821 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.848849 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.848865 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.848928 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:28.348909392 +0000 UTC m=+19.921513131 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.849226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.849303 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.849402 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.850125 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.856333 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.856455 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:28.356434309 +0000 UTC m=+19.929038048 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.856867 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.857286 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.857341 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:28.357331154 +0000 UTC m=+19.929934893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.857502 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.857572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.857901 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:24:28.357867529 +0000 UTC m=+19.930471268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.858211 4795 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.860203 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.871898 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.873183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.873559 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.874764 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.874853 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.874869 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.874881 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 08:24:27 crc kubenswrapper[4795]: E1205 08:24:27.874926 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:28.374909349 +0000 UTC m=+19.947513078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.875089 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.875707 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.875719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.875956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.876232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.876393 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.876567 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.876747 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.876837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.879780 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.879819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.880018 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.880272 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.880363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.880521 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.880839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.891802 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.892601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.892645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.892645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.893055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.893832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.893982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.894083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.896890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.897183 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.897689 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.899130 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.899352 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.901355 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.901377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.901352 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.901667 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.901886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.901912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.902013 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.902135 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.902364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.902756 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.903052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.903054 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.903174 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.903594 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.903781 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.905950 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.906029 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.906275 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.906653 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.906767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.906858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.908657 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827" exitCode=255
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.908990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827"}
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.909992 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.910268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.910747 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.910752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.911009 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.911151 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.911276 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.911464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.911605 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.911665 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.911751 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.911786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.912972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.913226 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.913528 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916055 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916095 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916108 4795 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916134 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916146 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916161 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916173 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916188 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916197 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916208 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916220 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916235 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916246 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916256 4795 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916269 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916260 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916280 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916375 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916389 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916402 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916428 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916441 4795 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916453 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916472 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916487 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916505 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916518 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916538 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916551 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916655 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916667 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916703 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916720 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916730 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916741 4795 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.916751 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922132 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922148 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922163 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922175 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922194 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922207 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc 
kubenswrapper[4795]: I1205 08:24:27.922221 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922234 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922245 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922255 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922266 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922280 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922293 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922308 4795 reconciler_common.go:293] "Volume detached 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922321 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922336 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922438 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922421 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922451 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922524 4795 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922538 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922554 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922568 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922559 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922587 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922687 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922718 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922735 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922746 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922759 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922775 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 
08:24:27.922785 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922807 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922823 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922834 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922843 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922853 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922923 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922936 4795 reconciler_common.go:293] "Volume detached for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922947 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922961 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922976 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922986 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.922996 4795 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923006 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923020 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on 
node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923030 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923042 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923056 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923066 4795 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923076 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923086 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923099 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923110 4795 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923120 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923132 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923146 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923156 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923166 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923176 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923191 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: 
\"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923201 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923212 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923228 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923240 4795 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923253 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923263 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923281 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923291 4795 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923301 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923311 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923323 4795 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923334 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923343 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923357 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923368 4795 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923377 4795 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923388 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923400 4795 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923409 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923419 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923428 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923442 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923452 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923461 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923471 4795 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923483 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923493 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: 
"etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923720 4795 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923737 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923747 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923761 4795 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923770 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923780 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923789 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923801 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923811 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923820 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923829 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923844 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923883 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923881 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod 
"09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923898 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923917 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923928 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.923957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: W1205 08:24:27.928221 4795 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.928268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: W1205 08:24:27.928366 4795 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~secret/serving-cert Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.928377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.928627 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.928709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.929215 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.929281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.929734 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.930001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.930035 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.929600 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.930334 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.930717 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.930845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.930884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: W1205 08:24:27.931006 4795 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~secret/v4-0-config-user-idp-0-file-data Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.931030 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.931068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.931084 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.931155 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.931273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.934940 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.937923 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.938471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.944925 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.947663 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.980719 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:27 crc kubenswrapper[4795]: I1205 08:24:27.999326 4795 scope.go:117] "RemoveContainer" containerID="99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.004525 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.006767 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.014196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024650 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024682 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024694 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024704 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024713 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024722 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc 
kubenswrapper[4795]: I1205 08:24:28.024730 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024738 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024746 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024755 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024763 4795 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024771 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024779 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024788 4795 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024795 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024804 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024812 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024820 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024829 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024839 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024847 4795 reconciler_common.go:293] "Volume detached 
for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024855 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024863 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024870 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024878 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024887 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024896 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024905 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath 
\"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024914 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.024972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.033499 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.047938 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.091995 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.186744 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.221772 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.249403 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.249937 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bhxnf"] Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.250273 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rns2q"] Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.250703 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-t68zt"] Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.250947 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.251444 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zmscs"] Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.251658 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zmscs" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.251944 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.252517 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.256394 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.257343 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.257411 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.257491 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.257860 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.258118 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.258317 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.258487 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.258693 4795 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.259307 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.259481 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.259636 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.259782 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.260325 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.260549 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.280998 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.307444 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328600 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-cni-dir\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328661 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-cnibin\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4q2r\" (UniqueName: \"kubernetes.io/projected/3273f819-71fb-4fdc-8869-dc3b787f4592-kube-api-access-z4q2r\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328712 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-rootfs\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-multus-certs\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95jqn\" (UniqueName: \"kubernetes.io/projected/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-kube-api-access-95jqn\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328778 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-os-release\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3273f819-71fb-4fdc-8869-dc3b787f4592-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.328838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-conf-dir\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.329069 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c390ffe7-ac55-487a-aabd-6e0a3245c6d8-hosts-file\") pod \"node-resolver-zmscs\" (UID: \"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\") " pod="openshift-dns/node-resolver-zmscs" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-os-release\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330247 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330267 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7cxmk\" (UniqueName: \"kubernetes.io/projected/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-kube-api-access-7cxmk\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-socket-dir-parent\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-cni-bin\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-etc-kubernetes\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330399 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-kubelet\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330416 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqr87\" (UniqueName: \"kubernetes.io/projected/c390ffe7-ac55-487a-aabd-6e0a3245c6d8-kube-api-access-cqr87\") pod \"node-resolver-zmscs\" (UID: \"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\") " pod="openshift-dns/node-resolver-zmscs" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-cnibin\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330448 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-cni-multus\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-hostroot\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330479 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-system-cni-dir\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330496 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3273f819-71fb-4fdc-8869-dc3b787f4592-cni-binary-copy\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-netns\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-daemon-config\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330562 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-cni-binary-copy\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330580 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-k8s-cni-cncf-io\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330599 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-proxy-tls\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.330655 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-system-cni-dir\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.334530 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.357038 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.393866 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.423342 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432266 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-k8s-cni-cncf-io\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-daemon-config\") pod \"multus-bhxnf\" (UID: 
\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432477 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-cni-binary-copy\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-proxy-tls\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-system-cni-dir\") pod \"multus-bhxnf\" (UID: 
\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-cni-dir\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-cnibin\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432576 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3273f819-71fb-4fdc-8869-dc3b787f4592-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432597 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4q2r\" (UniqueName: \"kubernetes.io/projected/3273f819-71fb-4fdc-8869-dc3b787f4592-kube-api-access-z4q2r\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-rootfs\") pod \"machine-config-daemon-t68zt\" (UID: 
\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432656 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-multus-certs\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432674 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95jqn\" (UniqueName: \"kubernetes.io/projected/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-kube-api-access-95jqn\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432693 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-os-release\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c390ffe7-ac55-487a-aabd-6e0a3245c6d8-hosts-file\") pod \"node-resolver-zmscs\" (UID: \"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\") " pod="openshift-dns/node-resolver-zmscs" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432740 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-conf-dir\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-os-release\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxmk\" (UniqueName: \"kubernetes.io/projected/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-kube-api-access-7cxmk\") pod \"machine-config-daemon-t68zt\" (UID: 
\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-socket-dir-parent\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-cni-bin\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-etc-kubernetes\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432955 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-kubelet\") pod \"multus-bhxnf\" (UID: 
\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432977 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqr87\" (UniqueName: \"kubernetes.io/projected/c390ffe7-ac55-487a-aabd-6e0a3245c6d8-kube-api-access-cqr87\") pod \"node-resolver-zmscs\" (UID: \"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\") " pod="openshift-dns/node-resolver-zmscs" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.432995 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-cnibin\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.433014 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-cni-multus\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.433031 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-hostroot\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.433049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-system-cni-dir\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc 
kubenswrapper[4795]: I1205 08:24:28.433067 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3273f819-71fb-4fdc-8869-dc3b787f4592-cni-binary-copy\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.433083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-netns\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.433153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-netns\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.433231 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:24:29.433214681 +0000 UTC m=+21.005818420 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.433259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-k8s-cni-cncf-io\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.433947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-daemon-config\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.434072 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.434089 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.434102 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.434138 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:29.434130736 +0000 UTC m=+21.006734475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.434173 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.434193 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:29.434188228 +0000 UTC m=+21.006791967 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.434746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-cni-binary-copy\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.435709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-cnibin\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.435760 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-run-multus-certs\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.435771 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-system-cni-dir\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.435931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-cni-dir\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.435973 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c390ffe7-ac55-487a-aabd-6e0a3245c6d8-hosts-file\") pod \"node-resolver-zmscs\" (UID: \"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\") " pod="openshift-dns/node-resolver-zmscs" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-os-release\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-conf-dir\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.436289 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.436303 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.436313 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.436342 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:29.436332677 +0000 UTC m=+21.008936416 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436470 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3273f819-71fb-4fdc-8869-dc3b787f4592-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-os-release\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436529 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-kubelet\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436532 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-rootfs\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-cni-bin\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436926 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-multus-socket-dir-parent\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.436965 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-host-var-lib-cni-multus\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-t68zt\" 
(UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-etc-kubernetes\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437118 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-cnibin\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.437147 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437154 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-system-cni-dir\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.437183 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:29.437172101 +0000 UTC m=+21.009775840 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-hostroot\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437350 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-05 08:19:27 +0000 UTC, rotation deadline is 2026-09-10 22:26:20.970582513 +0000 UTC Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3273f819-71fb-4fdc-8869-dc3b787f4592-cni-binary-copy\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437767 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3273f819-71fb-4fdc-8869-dc3b787f4592-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.437394 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6710h1m52.533190206s for next certificate rotation Dec 05 08:24:28 crc 
kubenswrapper[4795]: I1205 08:24:28.438880 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-proxy-tls\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.464857 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95jqn\" (UniqueName: \"kubernetes.io/projected/9dd42ab7-1f98-4f50-ae12-15ec6587bc4e-kube-api-access-95jqn\") pod \"multus-bhxnf\" (UID: \"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\") " pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.472736 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f5840
8f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa09
1104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.474769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxmk\" (UniqueName: \"kubernetes.io/projected/23494e8d-0824-46a2-9b0c-c447f1d5e5d0-kube-api-access-7cxmk\") pod \"machine-config-daemon-t68zt\" (UID: \"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\") " pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.485556 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-z4q2r\" (UniqueName: \"kubernetes.io/projected/3273f819-71fb-4fdc-8869-dc3b787f4592-kube-api-access-z4q2r\") pod \"multus-additional-cni-plugins-rns2q\" (UID: \"3273f819-71fb-4fdc-8869-dc3b787f4592\") " pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.489277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqr87\" (UniqueName: \"kubernetes.io/projected/c390ffe7-ac55-487a-aabd-6e0a3245c6d8-kube-api-access-cqr87\") pod \"node-resolver-zmscs\" (UID: \"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\") " pod="openshift-dns/node-resolver-zmscs" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.500471 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.519234 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.536994 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.539304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.543905 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.547875 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.548604 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.562140 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.572116 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.580057 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.583869 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23494e8d_0824_46a2_9b0c_c447f1d5e5d0.slice/crio-437051986fe9913b4a03a70b2c1a80721bb36d77e58ea9c5a038eef5e4391ec1 WatchSource:0}: Error finding container 437051986fe9913b4a03a70b2c1a80721bb36d77e58ea9c5a038eef5e4391ec1: Status 404 returned error can't find the container with id 437051986fe9913b4a03a70b2c1a80721bb36d77e58ea9c5a038eef5e4391ec1 Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.593355 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zmscs" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.602140 4795 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.602367 4795 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.602405 4795 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.602784 4795 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.602804 4795 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.602845 4795 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc 
kubenswrapper[4795]: W1205 08:24:28.602868 4795 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.602893 4795 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.602883 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c/status\": read tcp 38.102.83.222:40560->38.102.83.222:6443: use of closed network connection" Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603003 4795 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603044 4795 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items 
received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603065 4795 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603088 4795 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603115 4795 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603140 4795 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603140 4795 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603161 4795 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: 
W1205 08:24:28.603167 4795 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603182 4795 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603192 4795 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603205 4795 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603216 4795 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603238 4795 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no 
items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603244 4795 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603254 4795 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603259 4795 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.603307 4795 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.620146 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc390ffe7_ac55_487a_aabd_6e0a3245c6d8.slice/crio-fe7cfcb8fa8b9223bfb0ff850cc81fded4d57e4dc4d9dc3dac679e676de025e6 WatchSource:0}: Error finding container fe7cfcb8fa8b9223bfb0ff850cc81fded4d57e4dc4d9dc3dac679e676de025e6: Status 404 returned error can't find the container with id fe7cfcb8fa8b9223bfb0ff850cc81fded4d57e4dc4d9dc3dac679e676de025e6 Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.621992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.631296 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bhxnf" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.650182 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rns2q" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.657546 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.699149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.699950 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xl8v5"] Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.701257 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: W1205 08:24:28.702315 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3273f819_71fb_4fdc_8869_dc3b787f4592.slice/crio-a8be5a4ab023e8001b15b6752043ac33899903acfe81def4ed9c18c136d4c1be WatchSource:0}: Error finding container a8be5a4ab023e8001b15b6752043ac33899903acfe81def4ed9c18c136d4c1be: Status 404 returned error can't find the container with id a8be5a4ab023e8001b15b6752043ac33899903acfe81def4ed9c18c136d4c1be Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.704536 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.706316 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.707402 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.709499 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.710127 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.712577 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.712899 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736692 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-bin\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-node-log\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736786 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-log-socket\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736803 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-netns\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736823 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-slash\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736836 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-etc-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovn-node-metrics-cert\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736932 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-config\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 
08:24:28.736952 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-systemd-units\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736970 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-systemd\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.736986 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-var-lib-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.737004 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.737080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78k86\" (UniqueName: \"kubernetes.io/projected/cfece70d-6476-4442-bcc6-8ee82a8330c1-kube-api-access-78k86\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.737417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-netd\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.737498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-env-overrides\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.737530 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-kubelet\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.737548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-script-lib\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.737573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-ovn\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.746775 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:28 crc kubenswrapper[4795]: E1205 08:24:28.746936 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.757574 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687
7441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda72
6fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.758027 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.759450 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 05 08:24:28 crc 
kubenswrapper[4795]: I1205 08:24:28.763153 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.763834 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.765307 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.766003 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.767478 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.768599 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.771213 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.772196 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.773107 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.773768 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.778342 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.780774 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.781442 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.783791 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.784359 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.784987 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.785869 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.788084 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.790868 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.791384 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.798987 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.801458 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.801964 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.804025 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.804462 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.805195 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.809297 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.809950 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.811171 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.813297 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.813895 4795 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.814017 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.818328 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.818899 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.819930 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.821231 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.821718 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.823068 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.824146 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.824902 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.826740 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.827468 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.828854 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.829563 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.830681 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.834249 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.835338 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.836161 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.836673 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.837842 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 05 08:24:28 crc 
kubenswrapper[4795]: I1205 08:24:28.838836 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-config\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-systemd-units\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-systemd\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839423 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-var-lib-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78k86\" (UniqueName: \"kubernetes.io/projected/cfece70d-6476-4442-bcc6-8ee82a8330c1-kube-api-access-78k86\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-netd\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839556 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-env-overrides\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-var-lib-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-kubelet\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-script-lib\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-ovn\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 
crc kubenswrapper[4795]: I1205 08:24:28.843673 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843690 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-bin\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-kubelet\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-netns\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-node-log\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.840448 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-config\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-log-socket\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-log-socket\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.839975 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-systemd-units\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-slash\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-etc-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843867 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-bin\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.840409 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-systemd\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.840452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-netd\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovn-node-metrics-cert\") pod 
\"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-netns\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.841535 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.844188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-script-lib\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.844241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-node-log\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-ovn\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.844278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-slash\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.844300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-etc-openvswitch\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.840386 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.843107 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-env-overrides\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.844864 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.846877 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.859782 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovn-node-metrics-cert\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.862794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78k86\" (UniqueName: \"kubernetes.io/projected/cfece70d-6476-4442-bcc6-8ee82a8330c1-kube-api-access-78k86\") pod \"ovnkube-node-xl8v5\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.869391 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.870006 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.912569 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.928220 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.928270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.928281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b80b6a0347338d76a268791020bb987ae6e33f93407098761a00f457b0a93c42"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.931847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.940480 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhxnf" event={"ID":"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e","Type":"ContainerStarted","Data":"1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.940536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhxnf" 
event={"ID":"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e","Type":"ContainerStarted","Data":"6a3bcab74c3779d94ccc2c6d89ebcda552a7f3c99374f17afce0fb75fe09dfdf"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.943309 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zmscs" event={"ID":"c390ffe7-ac55-487a-aabd-6e0a3245c6d8","Type":"ContainerStarted","Data":"fe7cfcb8fa8b9223bfb0ff850cc81fded4d57e4dc4d9dc3dac679e676de025e6"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.945830 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.945869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ed7da8ff7ad593b0be52c16beda15e2a1e6466d4a1807c5d7f869582dd7e0b78"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.954389 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.959874 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.964685 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.965439 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.973731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" event={"ID":"3273f819-71fb-4fdc-8869-dc3b787f4592","Type":"ContainerStarted","Data":"a8be5a4ab023e8001b15b6752043ac33899903acfe81def4ed9c18c136d4c1be"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.975320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.975372 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"437051986fe9913b4a03a70b2c1a80721bb36d77e58ea9c5a038eef5e4391ec1"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.976833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7ab783443fad7b27cbc3dc27db113a165bdba731a4795e3502755e35160246ea"} Dec 05 08:24:28 crc kubenswrapper[4795]: I1205 08:24:28.980060 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.003232 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.021996 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.044754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.068765 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.096983 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.115633 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.134160 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.159931 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.179823 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.200749 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.225393 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.239759 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.254124 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.270656 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.286675 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.327697 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.362297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.416006 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.423582 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.441296 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.449560 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.449655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.449702 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:24:31.449686224 +0000 UTC m=+23.022289963 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.449735 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.449758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.449775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.449781 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.449804 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.449816 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.449841 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.450139 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:31.450121326 +0000 UTC m=+23.022725055 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.450202 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:31.450181288 +0000 UTC m=+23.022785027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.450277 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.450314 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:31.450307121 +0000 UTC m=+23.022910860 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.450374 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.450389 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.450398 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.450425 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:31.450418164 +0000 UTC m=+23.023021903 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.455793 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.461238 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.493962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.516520 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.544324 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.558035 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.559205 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.571370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.583749 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.602115 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.619297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.632149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.652024 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.661850 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.676026 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.711408 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.727430 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 
08:24:29.746517 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.746648 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.746704 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:29 crc kubenswrapper[4795]: E1205 08:24:29.746757 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.762231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.808555 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.842527 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.845973 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.848129 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.876440 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.882838 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.883299 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.933890 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.937192 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 08:24:29 crc 
kubenswrapper[4795]: I1205 08:24:29.980121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad"} Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.981427 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zmscs" event={"ID":"c390ffe7-ac55-487a-aabd-6e0a3245c6d8","Type":"ContainerStarted","Data":"bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1"} Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.982526 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089" exitCode=0 Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.982592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.982656 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"630d1bf9bd3891171ba8e1bb2893eb306f3c430e6b7c35654215ce270ee7288b"} Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.984799 4795 generic.go:334] "Generic (PLEG): container finished" podID="3273f819-71fb-4fdc-8869-dc3b787f4592" containerID="b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36" exitCode=0 Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.984849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" 
event={"ID":"3273f819-71fb-4fdc-8869-dc3b787f4592","Type":"ContainerDied","Data":"b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36"} Dec 05 08:24:29 crc kubenswrapper[4795]: I1205 08:24:29.986026 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.004570 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.006053 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.035213 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.066678 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.137085 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.146365 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.176029 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.198934 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.204987 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-
controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.248042 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.271866 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.317203 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.345360 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.371073 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.387461 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.451685 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.466815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.486027 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.565036 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.584777 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.606268 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.626687 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.644365 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.659219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.674931 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.689479 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.718670 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.746693 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:30 crc kubenswrapper[4795]: E1205 08:24:30.746842 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.755756 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.799156 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.838222 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.879964 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.911689 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:30Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.992004 4795 generic.go:334] "Generic (PLEG): container finished" podID="3273f819-71fb-4fdc-8869-dc3b787f4592" containerID="fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e" exitCode=0 Dec 05 08:24:30 crc kubenswrapper[4795]: I1205 08:24:30.992293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" event={"ID":"3273f819-71fb-4fdc-8869-dc3b787f4592","Type":"ContainerDied","Data":"fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.004417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 
08:24:31.004530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.004557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.004576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.004593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.017987 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.037999 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.055719 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.075276 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.114717 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.156600 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.191376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.231226 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.283195 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.314882 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.354436 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.419414 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.441576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.472173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.472317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.472355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.472374 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.472401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472537 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472554 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472567 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472713 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472762 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:35.472746218 +0000 UTC m=+27.045349957 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472792 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:35.472786459 +0000 UTC m=+27.045390198 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472786 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472845 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472866 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472903 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472945 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:35.472916053 +0000 UTC m=+27.045519822 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.472974 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:35.472962014 +0000 UTC m=+27.045565783 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.473036 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:24:35.473021496 +0000 UTC m=+27.045625265 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.478807 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.527575 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.530184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.530231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.530242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.530357 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.538423 4795 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.538722 4795 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.539744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.539785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.539803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.539823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.539838 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.561555 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.565544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.565595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.565646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.565669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.565687 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.585217 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.590329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.590384 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.590402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.590424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.590441 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.606389 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.611594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.611642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.611654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.611672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.611689 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.627560 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.633312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.633360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.633374 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.633394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.633407 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.651789 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:31Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.651977 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.653881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.653918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.653929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.653945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.653957 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.746587 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.746654 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.746805 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:31 crc kubenswrapper[4795]: E1205 08:24:31.746904 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.756444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.756654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.756783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.756876 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.756960 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.858975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.859063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.859088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.859121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.859145 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.962184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.962360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.962441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.962547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:31 crc kubenswrapper[4795]: I1205 08:24:31.962661 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:31Z","lastTransitionTime":"2025-12-05T08:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.010560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.014683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.017805 4795 generic.go:334] "Generic (PLEG): container finished" podID="3273f819-71fb-4fdc-8869-dc3b787f4592" containerID="a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922" exitCode=0 Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.017855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" event={"ID":"3273f819-71fb-4fdc-8869-dc3b787f4592","Type":"ContainerDied","Data":"a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.034767 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.064995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.065321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.065330 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.065344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.065353 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.075086 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name
\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.088364 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.113346 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.129222 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.146770 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.164602 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.167311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.167351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.167363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.167381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.167394 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.176727 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.188299 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.226906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.269205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.269242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.269256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.269273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.269286 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.277932 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.278510 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-nw8pr"] Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.278923 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.281993 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.282061 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.283351 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.285391 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.315129 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.334094 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.348931 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.371570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.371643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.371657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.371676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.371691 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.379768 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.380997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b8a3269-c30f-4b78-b300-dbb66ed703b8-host\") pod \"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.381046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppq2n\" (UniqueName: \"kubernetes.io/projected/9b8a3269-c30f-4b78-b300-dbb66ed703b8-kube-api-access-ppq2n\") pod \"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.381127 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9b8a3269-c30f-4b78-b300-dbb66ed703b8-serviceca\") pod 
\"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.395182 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 
08:24:32.413321 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.428095 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.438705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.450307 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.461226 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.470510 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.473897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.473958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.473998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.474018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.474029 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.482275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9b8a3269-c30f-4b78-b300-dbb66ed703b8-serviceca\") pod \"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.482366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b8a3269-c30f-4b78-b300-dbb66ed703b8-host\") pod \"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.482388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppq2n\" (UniqueName: \"kubernetes.io/projected/9b8a3269-c30f-4b78-b300-dbb66ed703b8-kube-api-access-ppq2n\") pod \"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.482455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9b8a3269-c30f-4b78-b300-dbb66ed703b8-host\") pod \"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.483296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9b8a3269-c30f-4b78-b300-dbb66ed703b8-serviceca\") pod \"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.528499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ppq2n\" (UniqueName: \"kubernetes.io/projected/9b8a3269-c30f-4b78-b300-dbb66ed703b8-kube-api-access-ppq2n\") pod \"node-ca-nw8pr\" (UID: \"9b8a3269-c30f-4b78-b300-dbb66ed703b8\") " pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.539252 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.576708 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.576744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.576754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.576767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.576776 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.578763 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.592046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nw8pr" Dec 05 08:24:32 crc kubenswrapper[4795]: W1205 08:24:32.605641 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8a3269_c30f_4b78_b300_dbb66ed703b8.slice/crio-18a0b512533f6d5a8535a620372aeedbf905308fd4301fefb46e45813013b3f6 WatchSource:0}: Error finding container 18a0b512533f6d5a8535a620372aeedbf905308fd4301fefb46e45813013b3f6: Status 404 returned error can't find the container with id 18a0b512533f6d5a8535a620372aeedbf905308fd4301fefb46e45813013b3f6 Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.619212 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.653263 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.682654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.682705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.682716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 
08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.682732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.682745 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.696233 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.732200 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.747244 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:32 crc kubenswrapper[4795]: E1205 08:24:32.747434 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.777000 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:32Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.785982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.786011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.786019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.786033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.786042 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.888693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.888740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.888752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.888770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.888782 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.991511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.991563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.991573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.991590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:32 crc kubenswrapper[4795]: I1205 08:24:32.991601 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:32Z","lastTransitionTime":"2025-12-05T08:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.027445 4795 generic.go:334] "Generic (PLEG): container finished" podID="3273f819-71fb-4fdc-8869-dc3b787f4592" containerID="94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426" exitCode=0 Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.027547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" event={"ID":"3273f819-71fb-4fdc-8869-dc3b787f4592","Type":"ContainerDied","Data":"94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.030235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nw8pr" event={"ID":"9b8a3269-c30f-4b78-b300-dbb66ed703b8","Type":"ContainerStarted","Data":"1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.030292 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nw8pr" event={"ID":"9b8a3269-c30f-4b78-b300-dbb66ed703b8","Type":"ContainerStarted","Data":"18a0b512533f6d5a8535a620372aeedbf905308fd4301fefb46e45813013b3f6"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.054122 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.077068 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.098042 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.101948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 
08:24:33.102019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.102039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.102064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.102084 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.115031 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.132892 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.145036 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.162926 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.186225 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.200718 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.206600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.206646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.206655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.206669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.206678 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.214653 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z 
is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.225379 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.255734 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.291913 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.308765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.308868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.308888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc 
kubenswrapper[4795]: I1205 08:24:33.308916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.308947 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.342691 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.381857 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.412092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.412139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.412153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.412171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.412183 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.419332 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.463585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.495994 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.518686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.518721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.518736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.518754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.518765 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.537782 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.574111 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.614119 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.620700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 
08:24:33.620729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.620739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.620754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.620764 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.652314 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.691661 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.722663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 
08:24:33.722688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.722696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.722710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.722718 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.731474 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.746232 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.746291 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:33 crc kubenswrapper[4795]: E1205 08:24:33.746377 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:33 crc kubenswrapper[4795]: E1205 08:24:33.746454 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.783483 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.818489 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.825409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.825445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.825456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.825472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.825483 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.853752 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.893111 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.927751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:33 crc 
kubenswrapper[4795]: I1205 08:24:33.927792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.927807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.927828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.927843 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:33Z","lastTransitionTime":"2025-12-05T08:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.942976 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:33 crc kubenswrapper[4795]: I1205 08:24:33.984843 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.030064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.030090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.030098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.030111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.030120 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.047299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.049507 4795 generic.go:334] "Generic (PLEG): container finished" podID="3273f819-71fb-4fdc-8869-dc3b787f4592" containerID="70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba" exitCode=0 Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.049534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" event={"ID":"3273f819-71fb-4fdc-8869-dc3b787f4592","Type":"ContainerDied","Data":"70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.071273 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.117105 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.132828 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.140529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.140592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.140610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.140650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.140664 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.147279 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.173069 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.215319 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.243553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.243590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.243600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.243635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.243647 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.254804 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.292103 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.340432 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.345429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.345498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.345514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.345542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.345562 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.377222 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d96443
11900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.413003 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.454462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.454509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.454518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.454534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.454542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.462242 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.493721 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.541523 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
2-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.556605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.556679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.556689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.556704 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.556715 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.574535 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:34Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.659208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.659432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.659524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.659597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.659699 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.746394 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:34 crc kubenswrapper[4795]: E1205 08:24:34.746519 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.761969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.762005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.762014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.762031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.762041 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.865062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.865104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.865112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.865127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.865135 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.968240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.968275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.968284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.968299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:34 crc kubenswrapper[4795]: I1205 08:24:34.968308 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:34Z","lastTransitionTime":"2025-12-05T08:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.056787 4795 generic.go:334] "Generic (PLEG): container finished" podID="3273f819-71fb-4fdc-8869-dc3b787f4592" containerID="0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c" exitCode=0 Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.056832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" event={"ID":"3273f819-71fb-4fdc-8869-dc3b787f4592","Type":"ContainerDied","Data":"0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.072213 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.072269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.072292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.072321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.072344 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.078872 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.097056 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.116514 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.127555 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.141535 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.154889 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.166417 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.175541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.175571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.175580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.175592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.175602 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.185389 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.204868 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.220907 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.234589 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.248275 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.261498 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.276552 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.278638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.278666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.278677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.278692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.278705 4795 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.422678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.422811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.422947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.423053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.423147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.426487 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.512751 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.512931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.512963 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 08:24:43.512925763 +0000 UTC m=+35.085529502 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.513019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.513099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513114 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.513125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513140 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513160 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513244 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:43.513205491 +0000 UTC m=+35.085809260 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513291 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513337 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:43.513327715 +0000 UTC m=+35.085931454 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513413 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513444 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:43.513435398 +0000 UTC m=+35.086039257 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513517 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513534 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513546 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.513606 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:43.513566922 +0000 UTC m=+35.086170661 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.525934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.526191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.526288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.526382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.526459 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.628764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.628810 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.628825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.628844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.628858 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.731353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.731816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.732008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.732178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.732327 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.746758 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.746762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.747169 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:35 crc kubenswrapper[4795]: E1205 08:24:35.747301 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.840172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.840221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.840233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.840253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.840266 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.943004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.943040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.943051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.943066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:35 crc kubenswrapper[4795]: I1205 08:24:35.943078 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:35Z","lastTransitionTime":"2025-12-05T08:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.046358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.046396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.046408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.046425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.046437 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.149257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.149317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.149332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.149355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.149373 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.253642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.253665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.253673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.253688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.253696 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.355983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.356039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.356055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.356095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.356115 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.459093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.459147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.459167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.459189 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.459207 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.546696 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.561457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.561518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.561528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.561544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.561555 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.664582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.664705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.664739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.664771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.664790 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.747134 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:36 crc kubenswrapper[4795]: E1205 08:24:36.747295 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.768039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.768212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.768245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.768278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.768304 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.871258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.871324 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.871343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.871365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.871382 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.973777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.973818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.973833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.973854 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:36 crc kubenswrapper[4795]: I1205 08:24:36.973868 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:36Z","lastTransitionTime":"2025-12-05T08:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.066493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" event={"ID":"3273f819-71fb-4fdc-8869-dc3b787f4592","Type":"ContainerStarted","Data":"43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.073534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.074272 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.074375 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.079043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.079138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.079148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.079168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.079183 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.093576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.117581 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.119942 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.121821 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.131733 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.147794 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.160393 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.175871 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.181471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.181504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.181515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.181532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.181542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.189628 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.201044 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.219994 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.233566 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.247014 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.260694 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.270471 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.283814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.283862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.283874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.283893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.283906 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.292860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.318555 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.333808 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.347342 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.362700 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.378164 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.387030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.387059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.387069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.387082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.387090 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.398080 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.412844 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.424897 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.436147 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.447564 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.463855 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.474846 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.488123 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.489415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.489443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.489453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.489701 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.489728 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.498726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.506658 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.523703 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:37Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.592305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.592338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.592348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.592362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.592371 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.695175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.695229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.695242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.695262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.695277 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.746922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:37 crc kubenswrapper[4795]: E1205 08:24:37.747181 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.747319 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:37 crc kubenswrapper[4795]: E1205 08:24:37.747451 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.798556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.798657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.798683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.798710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.798767 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.901682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.901732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.901746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.901771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:37 crc kubenswrapper[4795]: I1205 08:24:37.901785 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:37Z","lastTransitionTime":"2025-12-05T08:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.005140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.005181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.005191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.005207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.005216 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.076582 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.108033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.108166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.108195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.108220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.108238 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.233919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.233971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.233981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.234003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.234018 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.337485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.337592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.337654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.337683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.337699 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.441942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.441981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.441995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.442010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.442020 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.544703 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.544738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.544748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.544763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.544774 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.648822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.648859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.648870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.648884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.648893 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.746877 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:38 crc kubenswrapper[4795]: E1205 08:24:38.747074 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.751007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.751063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.751080 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.751185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.751209 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.770890 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.782343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.794332 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.820133 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.834119 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.849540 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.853445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.853650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.853765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.853879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.853991 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.867114 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.880014 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.905194 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.926345 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1
104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.940166 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.952498 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.956592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.956668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.956679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.956698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.956710 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:38Z","lastTransitionTime":"2025-12-05T08:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.966633 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.983030 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:38 crc kubenswrapper[4795]: I1205 08:24:38.997545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:38Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.059004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.059036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.059048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.059064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.059076 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.083193 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.162209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.162258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.162275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.162297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.162313 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.265152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.265207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.265223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.265243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.265256 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.367759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.367825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.367843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.367869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.367888 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.471031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.471109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.471136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.471164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.471181 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.573894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.573961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.573980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.574006 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.574025 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.677307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.677396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.677416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.677441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.677459 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.746990 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.747085 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:39 crc kubenswrapper[4795]: E1205 08:24:39.747116 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:39 crc kubenswrapper[4795]: E1205 08:24:39.747195 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.779990 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.780034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.780046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.780062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.780073 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.883131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.883188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.883207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.883233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.883253 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.986438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.986499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.986517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.986541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:39 crc kubenswrapper[4795]: I1205 08:24:39.986559 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:39Z","lastTransitionTime":"2025-12-05T08:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.088556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.088967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.089108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.089057 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/0.log" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.089204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.089294 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.092539 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698" exitCode=1 Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.092591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.093794 4795 scope.go:117] "RemoveContainer" containerID="dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.118292 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.132315 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.149446 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.183164 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b561
2f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.192137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.192165 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.192175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.192195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.192206 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.203858 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.222607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.243646 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.260186 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.294845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.294896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.294914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.294938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.294955 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.302903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.330060 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.353855 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.377526 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.398196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.398279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.398305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 
08:24:40.398337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.398364 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.399781 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.419739 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.437222 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:40Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.501489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.501563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.501584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.501609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.501667 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.605058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.605156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.605180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.605553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.605574 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.708896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.708965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.709053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.709155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.709194 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.747305 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:40 crc kubenswrapper[4795]: E1205 08:24:40.747473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.813061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.813149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.813175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.813209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.813234 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.916536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.916979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.917076 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.917176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:40 crc kubenswrapper[4795]: I1205 08:24:40.917271 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:40Z","lastTransitionTime":"2025-12-05T08:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.020860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.020940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.020966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.020999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.021023 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.103183 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/0.log" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.108890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.109065 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.124327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.124403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.124422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.124451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.124471 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.143959 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.162656 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.190208 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.206785 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.227197 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.227289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.227315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.227325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.227339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.227351 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.240961 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.254745 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.271880 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.290059 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.303655 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.330139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.330175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.330187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.330201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.330211 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.331267 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.352717 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.381837 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.407393 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.418860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.429705 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.431801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.431850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.431867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.431887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.431900 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.443186 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.456394 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.469535 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.488547 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.500886 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.514979 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 
08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.530061 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.534060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.534124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.534136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc 
kubenswrapper[4795]: I1205 08:24:41.534153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.534165 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.548336 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244"] Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.548639 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.548780 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.552428 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.552784 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.573332 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.586731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/993c8d73-2e31-4128-95a9-db06e34b8de1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.586780 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/993c8d73-2e31-4128-95a9-db06e34b8de1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.586812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rwt9\" (UniqueName: \"kubernetes.io/projected/993c8d73-2e31-4128-95a9-db06e34b8de1-kube-api-access-6rwt9\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.586948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/993c8d73-2e31-4128-95a9-db06e34b8de1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.591533 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.603814 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.617033 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.630143 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.637705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.637751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.637760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.637776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.637791 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.652721 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.674906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.688050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/993c8d73-2e31-4128-95a9-db06e34b8de1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.688099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/993c8d73-2e31-4128-95a9-db06e34b8de1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.689031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/993c8d73-2e31-4128-95a9-db06e34b8de1-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.689068 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/993c8d73-2e31-4128-95a9-db06e34b8de1-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc 
kubenswrapper[4795]: I1205 08:24:41.689098 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rwt9\" (UniqueName: \"kubernetes.io/projected/993c8d73-2e31-4128-95a9-db06e34b8de1-kube-api-access-6rwt9\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.689134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/993c8d73-2e31-4128-95a9-db06e34b8de1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.694014 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.697455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/993c8d73-2e31-4128-95a9-db06e34b8de1-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.709267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rwt9\" (UniqueName: \"kubernetes.io/projected/993c8d73-2e31-4128-95a9-db06e34b8de1-kube-api-access-6rwt9\") pod \"ovnkube-control-plane-749d76644c-xm244\" (UID: \"993c8d73-2e31-4128-95a9-db06e34b8de1\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.711231 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.733196 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.740537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.740600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.740638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.740658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.740671 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.742041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.742069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.742078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.742089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.742098 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.747200 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.747210 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:41 crc kubenswrapper[4795]: E1205 08:24:41.747338 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:41 crc kubenswrapper[4795]: E1205 08:24:41.747445 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.749873 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: E1205 08:24:41.759022 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.763461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.763502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.763512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.763529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.763540 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.765126 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.783051 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: E1205 08:24:41.785398 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.790743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.790886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.790950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.791027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.791088 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.799575 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: E1205 08:24:41.802515 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.806790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.806995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.807066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.807170 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.807233 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.816698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: E1205 08:24:41.820454 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.825521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.825577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.825592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.825628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.825641 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.832222 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: E1205 08:24:41.842590 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: E1205 08:24:41.842735 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.844935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.845051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.845082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.845102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.845122 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.853257 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.864296 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.873157 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] 
Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: W1205 08:24:41.878226 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod993c8d73_2e31_4128_95a9_db06e34b8de1.slice/crio-d4e467daaaec8fc5b98487dc5986ecaae88fc93313ef509773f7ef0911e5370e WatchSource:0}: Error finding container d4e467daaaec8fc5b98487dc5986ecaae88fc93313ef509773f7ef0911e5370e: Status 404 returned error can't find the container with id d4e467daaaec8fc5b98487dc5986ecaae88fc93313ef509773f7ef0911e5370e Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.890292 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.912941 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.928175 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.942056 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.948138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.948203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.948217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.948243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.948254 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:41Z","lastTransitionTime":"2025-12-05T08:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:41 crc kubenswrapper[4795]: I1205 08:24:41.957234 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:41Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.052388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.052439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.052455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.052475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.052487 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.113193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" event={"ID":"993c8d73-2e31-4128-95a9-db06e34b8de1","Type":"ContainerStarted","Data":"d4e467daaaec8fc5b98487dc5986ecaae88fc93313ef509773f7ef0911e5370e"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.155529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.155582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.155602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.155658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.155681 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.259320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.259385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.259399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.259422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.259437 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.362437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.362502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.362522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.362547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.362564 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.466344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.466409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.466427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.466483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.466502 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.569594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.569654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.569669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.569687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.569702 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.672891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.672936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.672955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.672978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.672992 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.683302 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8cnbm"] Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.684067 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:42 crc kubenswrapper[4795]: E1205 08:24:42.684151 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.701980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwhlp\" (UniqueName: \"kubernetes.io/projected/6c9f96ec-f615-4030-a78d-2dd56932c6c1-kube-api-access-rwhlp\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.702156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.707558 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.730223 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1
104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.745897 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.746425 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:42 crc kubenswrapper[4795]: E1205 08:24:42.746597 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.763968 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.776276 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.776500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.776667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 
08:24:42.776785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.776893 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.781958 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.800827 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.803467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:42 crc kubenswrapper[4795]: E1205 08:24:42.803750 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:42 crc kubenswrapper[4795]: E1205 08:24:42.803914 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs podName:6c9f96ec-f615-4030-a78d-2dd56932c6c1 nodeName:}" failed. 
No retries permitted until 2025-12-05 08:24:43.303885222 +0000 UTC m=+34.876488991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs") pod "network-metrics-daemon-8cnbm" (UID: "6c9f96ec-f615-4030-a78d-2dd56932c6c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.803786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwhlp\" (UniqueName: \"kubernetes.io/projected/6c9f96ec-f615-4030-a78d-2dd56932c6c1-kube-api-access-rwhlp\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.818978 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.829994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwhlp\" (UniqueName: \"kubernetes.io/projected/6c9f96ec-f615-4030-a78d-2dd56932c6c1-kube-api-access-rwhlp\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.836370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.848246 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.861555 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.880470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.880891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.880998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.881095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.881213 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.881451 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.906736 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc 
kubenswrapper[4795]: I1205 08:24:42.921901 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.939273 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.960605 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.976694 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.987040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.987081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.987092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.987110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.987122 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:42Z","lastTransitionTime":"2025-12-05T08:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:42 crc kubenswrapper[4795]: I1205 08:24:42.994332 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:42Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.091006 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.091377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.091395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.091413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.091425 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.120219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" event={"ID":"993c8d73-2e31-4128-95a9-db06e34b8de1","Type":"ContainerStarted","Data":"31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.120668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" event={"ID":"993c8d73-2e31-4128-95a9-db06e34b8de1","Type":"ContainerStarted","Data":"6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.122365 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/1.log" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.123316 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/0.log" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.127607 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c" exitCode=1 Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.127708 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.127749 4795 scope.go:117] "RemoveContainer" containerID="dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.128634 
4795 scope.go:117] "RemoveContainer" containerID="e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c" Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.128776 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.139537 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.168996 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.183977 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.194273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 
08:24:43.194306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.194315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.194330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.194343 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.199557 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.216347 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.233230 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.248343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.275895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.287731 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc 
kubenswrapper[4795]: I1205 08:24:43.297009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.297053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.297064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.297081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.297102 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.305083 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.308706 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.308839 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.308933 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs podName:6c9f96ec-f615-4030-a78d-2dd56932c6c1 nodeName:}" failed. 
No retries permitted until 2025-12-05 08:24:44.308911105 +0000 UTC m=+35.881514864 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs") pod "network-metrics-daemon-8cnbm" (UID: "6c9f96ec-f615-4030-a78d-2dd56932c6c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.321010 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.334533 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.346227 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6
feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.359594 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.372912 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.394235 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.399183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.399261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.399272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.399295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.399310 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.413719 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.431901 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.446639 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.461760 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.477784 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.492011 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.502544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.502602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.502634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.502658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.502673 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.507930 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.523180 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.539658 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.559796 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-marketplace_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:41.895068 6196 services_controller.go:452] Built service openshift-marketplace/redhat-marketplace per-node LB for network=default: []services.LB{}\\\\nI1205 08:24:41.895083 6196 services_controller.go:453] Built service openshift-marketplace/redhat-marketplace template LB for network=default: []services.LB{}\\\\nF1205 08:24:41.895091 6196 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.573994 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.598124 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.608290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.608349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.608365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.608386 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.608401 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.610717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.610831 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.610876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.610908 4795 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:24:59.610882781 +0000 UTC m=+51.183486570 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.610986 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.611006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611028 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611052 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-05 08:24:59.611035585 +0000 UTC m=+51.183639404 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611061 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611086 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.611086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611140 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:59.611118937 +0000 UTC m=+51.183722746 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611144 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611192 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611209 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611222 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611194 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:59.611183749 +0000 UTC m=+51.183787568 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.611280 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:59.611271341 +0000 UTC m=+51.183875170 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.624579 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.648542 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.674013 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.699093 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.711179 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.711256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.711273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.711295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.711310 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.725463 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.741352 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:43Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.746566 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.746573 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.746931 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:43 crc kubenswrapper[4795]: E1205 08:24:43.747045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.814783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.814842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.814856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.814879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.814892 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.917482 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.917568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.917582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.917630 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:43 crc kubenswrapper[4795]: I1205 08:24:43.917645 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:43Z","lastTransitionTime":"2025-12-05T08:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.020223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.020289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.020306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.020351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.020369 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.154452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.154512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.154525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.154548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.154562 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.157006 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/1.log" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.258119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.258228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.258248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.258311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.258331 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.317372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:44 crc kubenswrapper[4795]: E1205 08:24:44.317662 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:44 crc kubenswrapper[4795]: E1205 08:24:44.317841 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs podName:6c9f96ec-f615-4030-a78d-2dd56932c6c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:46.317815171 +0000 UTC m=+37.890418950 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs") pod "network-metrics-daemon-8cnbm" (UID: "6c9f96ec-f615-4030-a78d-2dd56932c6c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.362359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.362472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.362495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.362562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.362583 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.466090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.466175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.466190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.466216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.466231 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.569954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.570045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.570059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.570088 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.570114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.673659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.673733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.673753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.673780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.673799 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.746857 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.746949 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:44 crc kubenswrapper[4795]: E1205 08:24:44.747037 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:44 crc kubenswrapper[4795]: E1205 08:24:44.747143 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.777475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.777864 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.777883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.777904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.777916 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.881256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.881318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.881334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.881358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.881375 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.983803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.983834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.983843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.983856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:44 crc kubenswrapper[4795]: I1205 08:24:44.983864 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:44Z","lastTransitionTime":"2025-12-05T08:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.087499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.087547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.087560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.087578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.087592 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.190921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.190967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.190978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.190996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.191007 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.295557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.295661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.295679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.295708 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.295726 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.398739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.398896 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.398910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.398936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.398953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.501370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.501421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.501432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.501451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.501464 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.604474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.604524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.604534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.604554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.604564 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.708244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.708321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.708342 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.708385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.708407 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.747096 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.747314 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:45 crc kubenswrapper[4795]: E1205 08:24:45.747419 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:45 crc kubenswrapper[4795]: E1205 08:24:45.747526 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.812480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.812549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.812571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.812603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.812649 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.916309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.916391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.916409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.916434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:45 crc kubenswrapper[4795]: I1205 08:24:45.916451 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:45Z","lastTransitionTime":"2025-12-05T08:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.020537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.020599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.020635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.020659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.020674 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.124186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.124249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.124266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.124288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.124305 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.227443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.227898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.228009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.228103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.228183 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.331762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.331841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.331859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.331887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.331904 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.342366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:46 crc kubenswrapper[4795]: E1205 08:24:46.342605 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:46 crc kubenswrapper[4795]: E1205 08:24:46.342711 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs podName:6c9f96ec-f615-4030-a78d-2dd56932c6c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:50.342687355 +0000 UTC m=+41.915291134 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs") pod "network-metrics-daemon-8cnbm" (UID: "6c9f96ec-f615-4030-a78d-2dd56932c6c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.435089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.435521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.435605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.435705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.435792 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.539044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.539097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.539108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.539126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.539139 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.642342 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.642442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.642471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.642505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.642528 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.746163 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.746204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.746222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.746248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.746272 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.746295 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:46 crc kubenswrapper[4795]: E1205 08:24:46.746468 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.746653 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:46 crc kubenswrapper[4795]: E1205 08:24:46.746739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.849679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.849747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.849767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.849791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.849808 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.953424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.953901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.954216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.954411 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:46 crc kubenswrapper[4795]: I1205 08:24:46.954554 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:46Z","lastTransitionTime":"2025-12-05T08:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.057947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.058008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.058031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.058061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.058088 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.163356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.163414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.163431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.163456 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.163473 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.266802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.266894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.266925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.266954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.266974 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.369785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.369856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.369924 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.369953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.369991 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.472845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.472906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.472918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.472938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.472956 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.575885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.575934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.575943 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.575958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.576003 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.678901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.678965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.678982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.679018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.679036 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.746192 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:47 crc kubenswrapper[4795]: E1205 08:24:47.746318 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.746603 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:47 crc kubenswrapper[4795]: E1205 08:24:47.746920 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.781645 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.781697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.781713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.781731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.781743 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.884981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.885025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.885039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.885058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.885069 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.988132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.988239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.988261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.988287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:47 crc kubenswrapper[4795]: I1205 08:24:47.988305 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:47Z","lastTransitionTime":"2025-12-05T08:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.091295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.091355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.091371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.091391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.091406 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.194491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.194842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.194953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.195032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.195104 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.297808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.297867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.297880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.297899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.297912 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.401089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.401467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.401682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.401908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.402057 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.505438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.505536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.505558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.505588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.505651 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.608185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.608251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.608320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.608349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.608366 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.711719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.711777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.711795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.711819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.711838 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.747023 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.747173 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:48 crc kubenswrapper[4795]: E1205 08:24:48.747443 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:48 crc kubenswrapper[4795]: E1205 08:24:48.747569 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.768163 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.785643 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.802358 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.813985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc 
kubenswrapper[4795]: I1205 08:24:48.814029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.814041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.814089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.814101 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.817752 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.834428 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.857680 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.871769 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.885523 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.901852 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.916398 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.917836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.918177 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.918202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.918223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.918238 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:48Z","lastTransitionTime":"2025-12-05T08:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.930328 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.943991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.962410 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 
08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.982083 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:48 crc kubenswrapper[4795]: I1205 08:24:48.997876 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:48Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.021384 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.021436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.021448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.021500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.021518 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.028732 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-marketplace_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:41.895068 6196 services_controller.go:452] Built service openshift-marketplace/redhat-marketplace per-node LB for network=default: []services.LB{}\\\\nI1205 08:24:41.895083 6196 services_controller.go:453] Built service openshift-marketplace/redhat-marketplace template LB for network=default: []services.LB{}\\\\nF1205 08:24:41.895091 6196 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:49Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.048545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:49Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.124477 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.124522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.124536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.124555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.124568 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.227385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.227424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.227434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.227451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.227465 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.330884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.330922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.330934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.330950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.330962 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.434361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.434396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.434404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.434416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.434425 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.537097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.537155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.537173 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.537199 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.537217 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.640759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.640847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.640867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.640888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.640923 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.743809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.743870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.743886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.743910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.743928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.747124 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.747253 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:49 crc kubenswrapper[4795]: E1205 08:24:49.747378 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:49 crc kubenswrapper[4795]: E1205 08:24:49.747739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.847231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.847307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.847335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.847380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.847404 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.950053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.950095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.950105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.950119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:49 crc kubenswrapper[4795]: I1205 08:24:49.950132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:49Z","lastTransitionTime":"2025-12-05T08:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.052249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.052289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.052298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.052313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.052322 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.154879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.154924 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.154936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.154953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.154967 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.258028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.258214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.258246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.258280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.258304 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.361133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.361202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.361219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.361245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.361263 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.391096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:50 crc kubenswrapper[4795]: E1205 08:24:50.391268 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:50 crc kubenswrapper[4795]: E1205 08:24:50.391372 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs podName:6c9f96ec-f615-4030-a78d-2dd56932c6c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:24:58.391345745 +0000 UTC m=+49.963949524 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs") pod "network-metrics-daemon-8cnbm" (UID: "6c9f96ec-f615-4030-a78d-2dd56932c6c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.464451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.464513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.464535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.464560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.464578 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.567688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.567733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.567744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.567763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.567776 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.670925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.671002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.671026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.671057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.671083 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.746681 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:50 crc kubenswrapper[4795]: E1205 08:24:50.746921 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.746953 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:50 crc kubenswrapper[4795]: E1205 08:24:50.747234 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.775183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.775310 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.775385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.775414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.775488 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.878498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.878566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.878584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.878609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.878699 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.983668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.983731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.983744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.983764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:50 crc kubenswrapper[4795]: I1205 08:24:50.983779 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:50Z","lastTransitionTime":"2025-12-05T08:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.086100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.086145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.086155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.086172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.086188 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.189023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.189083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.189095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.189119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.189135 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.298044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.298106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.298124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.298151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.298169 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.401317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.401379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.401396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.401420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.401439 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.504669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.504762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.504785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.504816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.504839 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.606766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.606833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.606870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.606899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.606921 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.710021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.710261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.710282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.710309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.710327 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.747317 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.747401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:51 crc kubenswrapper[4795]: E1205 08:24:51.747488 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:51 crc kubenswrapper[4795]: E1205 08:24:51.747658 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.813687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.813747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.813765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.813791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.813812 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.917300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.917357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.917370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.917392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.917405 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.932783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.932828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.932837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.932855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.932866 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: E1205 08:24:51.952165 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:51Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.957218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.957262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.957271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.957289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.957300 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:51 crc kubenswrapper[4795]: E1205 08:24:51.975600 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:51Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.980936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.981010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.981022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.981040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:51 crc kubenswrapper[4795]: I1205 08:24:51.981051 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:51Z","lastTransitionTime":"2025-12-05T08:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: E1205 08:24:52.000529 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:51Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.005585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.005647 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.005663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.005687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.005702 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: E1205 08:24:52.020885 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:52Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.025966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.026004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.026021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.026043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.026058 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: E1205 08:24:52.045687 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:52Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:52 crc kubenswrapper[4795]: E1205 08:24:52.045922 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.048165 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.048213 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.048231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.048258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.048276 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.150681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.150712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.150723 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.150737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.150747 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.253385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.253445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.253459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.253484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.253503 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.357046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.357111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.357128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.357159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.357178 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.460669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.460730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.460752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.460780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.460805 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.564145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.564214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.564239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.564269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.564292 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.668774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.668845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.668871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.668900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.668929 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.747125 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.747139 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:52 crc kubenswrapper[4795]: E1205 08:24:52.747280 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:52 crc kubenswrapper[4795]: E1205 08:24:52.747450 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.772496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.772554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.772591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.772657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.772684 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.876094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.876164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.876190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.876219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.876243 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.981244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.981296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.981308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.981331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:52 crc kubenswrapper[4795]: I1205 08:24:52.981344 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:52Z","lastTransitionTime":"2025-12-05T08:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.085024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.085055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.085066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.085082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.085093 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.188254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.188323 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.188339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.188355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.188368 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.291392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.291460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.291480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.291514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.291532 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.394732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.394813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.394834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.394861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.394879 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.497386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.497483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.497495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.497513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.497525 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.600782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.600832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.600843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.600861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.600873 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.703551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.703590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.703601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.703639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.703650 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.746269 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.746344 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:53 crc kubenswrapper[4795]: E1205 08:24:53.746428 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:53 crc kubenswrapper[4795]: E1205 08:24:53.746536 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.807580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.807697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.807719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.807781 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.807804 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.911751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.911839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.911867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.911897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:53 crc kubenswrapper[4795]: I1205 08:24:53.911922 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:53Z","lastTransitionTime":"2025-12-05T08:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.015354 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.015449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.015478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.015511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.015536 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.118889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.118941 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.118949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.118965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.118975 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.221975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.222050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.222069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.222095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.222115 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.325424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.325515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.325533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.325559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.325576 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.429318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.429371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.429390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.429420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.429438 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.436969 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.449363 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.457826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.474796 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.493992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6
feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.516417 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.533786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.533851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.533870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.533900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.533925 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.535977 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.572908 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.598141 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1
104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.622219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.636393 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.637186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 
crc kubenswrapper[4795]: I1205 08:24:54.637243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.637258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.637280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.637299 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.637772 4795 scope.go:117] "RemoveContainer" containerID="e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.644436 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.665557 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.680961 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.695918 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.715585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.728740 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.740448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.740715 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.740796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.740877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.740952 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.747137 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.747226 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:54 crc kubenswrapper[4795]: E1205 08:24:54.747458 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:54 crc kubenswrapper[4795]: E1205 08:24:54.747573 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.752113 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff1c7d7c48d070bcbd3242bfb39363c9975518fb40deaf44128467c54bdd698\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:39Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:140\\\\nI1205 08:24:39.190899 6054 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1205 08:24:39.191396 6054 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI1205 08:24:39.191414 6054 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1205 08:24:39.191431 6054 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1205 08:24:39.191436 6054 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1205 08:24:39.191449 6054 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1205 08:24:39.191455 6054 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1205 08:24:39.191461 6054 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1205 08:24:39.191474 6054 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1205 08:24:39.191476 6054 factory.go:656] Stopping watch factory\\\\nI1205 08:24:39.191482 6054 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:39.191490 6054 ovnkube.go:599] Stopped ovnkube\\\\nI1205 08:24:39.191491 6054 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:39.191504 6054 handler.go:208] Removed *v1.Node event handler 2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-marketplace_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:41.895068 6196 services_controller.go:452] Built service openshift-marketplace/redhat-marketplace per-node LB for network=default: []services.LB{}\\\\nI1205 08:24:41.895083 6196 services_controller.go:453] Built service openshift-marketplace/redhat-marketplace template LB for network=default: []services.LB{}\\\\nF1205 08:24:41.895091 6196 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.763846 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.777392 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.791722 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.809230 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.829154 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.845448 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.845665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.845698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.845710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.845727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.845740 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.861297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.886306 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-marketplace_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:41.895068 6196 services_controller.go:452] Built service openshift-marketplace/redhat-marketplace per-node LB for network=default: []services.LB{}\\\\nI1205 08:24:41.895083 6196 services_controller.go:453] Built service openshift-marketplace/redhat-marketplace template LB for network=default: []services.LB{}\\\\nF1205 08:24:41.895091 6196 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3
147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.901758 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.918466 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4
f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.932127 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.944347 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.947862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.947894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.947906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:54 crc 
kubenswrapper[4795]: I1205 08:24:54.947919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.947928 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:54Z","lastTransitionTime":"2025-12-05T08:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.955152 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.965287 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6
feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.978353 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:54 crc kubenswrapper[4795]: I1205 08:24:54.989348 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:54Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.004815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.018840 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.044403 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-1
2-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.050257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.050288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.050299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.050316 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.050327 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.063370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb0
72a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29
b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.152014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.152049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.152058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.152070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.152080 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.204637 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/1.log" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.207878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.226159 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.239231 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.255449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.255515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.255532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.255554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.255570 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.259205 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.282198 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.299343 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.319207 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.333358 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1
104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.352900 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.357686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.357730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc 
kubenswrapper[4795]: I1205 08:24:55.357743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.357761 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.357776 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.370169 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.386583 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.400314 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.414522 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.426830 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.437481 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.456907 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-marketplace_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:41.895068 6196 services_controller.go:452] Built service openshift-marketplace/redhat-marketplace per-node LB for network=default: []services.LB{}\\\\nI1205 08:24:41.895083 6196 services_controller.go:453] Built service openshift-marketplace/redhat-marketplace template LB for network=default: []services.LB{}\\\\nF1205 08:24:41.895091 6196 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed 
t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.460315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.460423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.460500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.460575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.460718 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.471146 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc 
kubenswrapper[4795]: I1205 08:24:55.485412 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a
07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 
08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.497494 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:55Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.564113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.564369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.564445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.564516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.564639 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.667689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.667735 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.667745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.667763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.667773 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.746870 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:55 crc kubenswrapper[4795]: E1205 08:24:55.747588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.746880 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:55 crc kubenswrapper[4795]: E1205 08:24:55.747907 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.770001 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.770059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.770080 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.770103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.770121 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.873190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.873261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.873285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.873318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.873346 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.976331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.976406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.976417 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.976438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:55 crc kubenswrapper[4795]: I1205 08:24:55.976452 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:55Z","lastTransitionTime":"2025-12-05T08:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.079446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.079485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.079497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.079512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.079523 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.181425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.181883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.182164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.182433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.182665 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.214942 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/2.log" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.216474 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/1.log" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.220370 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7" exitCode=1 Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.220440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.220497 4795 scope.go:117] "RemoveContainer" containerID="e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.221910 4795 scope.go:117] "RemoveContainer" containerID="eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7" Dec 05 08:24:56 crc kubenswrapper[4795]: E1205 08:24:56.222253 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.241233 4795 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.261384 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.277370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.285045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.285246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.285402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc 
kubenswrapper[4795]: I1205 08:24:56.285564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.285724 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.298228 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.314584 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c
9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.339196 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.379237 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1
104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.388145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.388183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.388193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.388208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.388219 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.408655 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.428491 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.444479 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.459791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.476994 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.490476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.490508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.490516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.490530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.490539 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.503363 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-marketplace_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:41.895068 6196 services_controller.go:452] Built service openshift-marketplace/redhat-marketplace per-node LB for network=default: []services.LB{}\\\\nI1205 08:24:41.895083 6196 services_controller.go:453] Built service openshift-marketplace/redhat-marketplace template LB for network=default: []services.LB{}\\\\nF1205 08:24:41.895091 6196 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.525997 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.544036 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.559009 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.574812 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.588200 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:56Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.593129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.593199 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.593214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.593250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.593266 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.696067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.696125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.696142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.696207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.696230 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.747017 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.747087 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:56 crc kubenswrapper[4795]: E1205 08:24:56.747224 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:56 crc kubenswrapper[4795]: E1205 08:24:56.747348 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.799216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.799271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.799282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.799297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.799308 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.902881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.902957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.902981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.903010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:56 crc kubenswrapper[4795]: I1205 08:24:56.903031 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:56Z","lastTransitionTime":"2025-12-05T08:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.006967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.007010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.007022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.007038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.007047 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.109907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.109974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.110002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.110037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.110058 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.213817 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.213873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.213885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.213902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.213913 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.226445 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/2.log" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.316576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.316694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.316718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.316744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.316761 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.420314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.420425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.420451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.420484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.420512 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.523527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.523652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.523707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.523742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.523765 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.626450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.626504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.626521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.626554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.626572 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.729827 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.729892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.729909 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.729938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.729959 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.746970 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.747130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:57 crc kubenswrapper[4795]: E1205 08:24:57.747191 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:57 crc kubenswrapper[4795]: E1205 08:24:57.747365 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.833925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.833999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.834017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.834044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.834077 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.936934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.936979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.936987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.937002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:57 crc kubenswrapper[4795]: I1205 08:24:57.937012 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:57Z","lastTransitionTime":"2025-12-05T08:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.040181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.040267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.040285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.040315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.040338 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.145084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.145141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.145154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.145172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.145185 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.249332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.249390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.249406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.249428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.249444 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.369934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.370024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.370043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.370070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.370089 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.473478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.473553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.473574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.473604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.473664 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.482236 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:58 crc kubenswrapper[4795]: E1205 08:24:58.482444 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:58 crc kubenswrapper[4795]: E1205 08:24:58.482536 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs podName:6c9f96ec-f615-4030-a78d-2dd56932c6c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:25:14.482511625 +0000 UTC m=+66.055115404 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs") pod "network-metrics-daemon-8cnbm" (UID: "6c9f96ec-f615-4030-a78d-2dd56932c6c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.577320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.577377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.577394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.577478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.577498 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.680815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.680882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.680907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.680939 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.680959 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.746952 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:24:58 crc kubenswrapper[4795]: E1205 08:24:58.747961 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.748345 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:58 crc kubenswrapper[4795]: E1205 08:24:58.748528 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.763075 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.788398 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4e79eda8413cb50b1d1cc567129840db22447ed3d6dcb27b30939a34e078f9c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"ces.LB{Name:\\\\\\\"Service_openshift-marketplace/redhat-marketplace_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:41.895068 6196 services_controller.go:452] Built service openshift-marketplace/redhat-marketplace per-node LB for network=default: []services.LB{}\\\\nI1205 08:24:41.895083 6196 services_controller.go:453] Built service openshift-marketplace/redhat-marketplace template LB for network=default: []services.LB{}\\\\nF1205 08:24:41.895091 6196 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed t\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.795230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.795305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.795328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.795360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.795382 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.808821 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc 
kubenswrapper[4795]: I1205 08:24:58.829056 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a
07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 
08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.846886 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.865225 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.881523 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.898168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.898243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.898255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.898288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.898324 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:58Z","lastTransitionTime":"2025-12-05T08:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.900985 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.921637 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.941799 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.956953 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:58 crc kubenswrapper[4795]: I1205 08:24:58.987917 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:58Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.001068 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.001136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.001152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.001180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.001197 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.009069 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:59Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.029305 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:
28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:59Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.052123 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:59Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.068254 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:24:59Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.083317 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:59Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.098630 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:24:59Z is after 2025-08-24T17:21:41Z" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.103884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.103946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.103960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.103980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.103996 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.207719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.207768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.207779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.207796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.207808 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.310390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.310445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.310457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.310476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.310487 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.413560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.413653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.413681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.413711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.413734 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.516360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.516410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.516421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.516438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.516450 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.619232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.619276 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.619312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.619337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.619350 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.698860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.699010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.699050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.699099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699155 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-05 08:25:31.699120296 +0000 UTC m=+83.271724065 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699191 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699241 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:25:31.699231619 +0000 UTC m=+83.271835358 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699271 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699339 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699365 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699390 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699421 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699443 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699454 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 08:25:31.699428575 +0000 UTC m=+83.272032344 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699511 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 08:25:31.699488136 +0000 UTC m=+83.272091905 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.699651 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699789 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.699837 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:25:31.699822035 +0000 UTC m=+83.272425814 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.722663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.722736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.722751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.722769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.722783 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.746341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.746377 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.746558 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:24:59 crc kubenswrapper[4795]: E1205 08:24:59.746728 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.827072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.827142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.827162 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.827191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.827210 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.930097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.930159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.930180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.930205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:24:59 crc kubenswrapper[4795]: I1205 08:24:59.930224 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:24:59Z","lastTransitionTime":"2025-12-05T08:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.033428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.033538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.033558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.033600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.033647 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.137228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.137283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.137300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.137325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.137342 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.239999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.240060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.240078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.240104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.240121 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.343753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.343822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.343839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.343862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.343879 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.446961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.447030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.447048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.447072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.447092 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.550692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.550779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.550812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.550845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.550866 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.653605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.653737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.653760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.653790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.653816 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.747298 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.747303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:00 crc kubenswrapper[4795]: E1205 08:25:00.747539 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:00 crc kubenswrapper[4795]: E1205 08:25:00.747673 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.756833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.757243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.757684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.757900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.758152 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.861993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.862030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.862039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.862053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.862062 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.964863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.964925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.964946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.964971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:00 crc kubenswrapper[4795]: I1205 08:25:00.964990 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:00Z","lastTransitionTime":"2025-12-05T08:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.067179 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.067251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.067267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.067290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.067307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.169796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.169861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.169879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.169904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.169923 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.271984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.272022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.272035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.272054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.272069 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.374745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.374802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.374815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.374850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.374863 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.478110 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.478209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.478233 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.478262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.478285 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.581308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.581367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.581381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.581401 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.581416 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.685003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.685069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.685080 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.685099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.685110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.746857 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:01 crc kubenswrapper[4795]: E1205 08:25:01.747038 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.746864 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:01 crc kubenswrapper[4795]: E1205 08:25:01.747150 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.788135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.788263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.788284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.788345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.788363 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.891335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.891397 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.891409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.891425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.891434 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.995301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.995450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.995473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.995497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:01 crc kubenswrapper[4795]: I1205 08:25:01.995515 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:01Z","lastTransitionTime":"2025-12-05T08:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.098451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.098493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.098506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.098522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.098534 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.202545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.202667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.202697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.202731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.202755 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.306382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.306429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.306443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.306462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.306476 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.398363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.398426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.398445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.398468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.398485 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: E1205 08:25:02.420968 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:02Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.427359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.427653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.427797 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.427979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.428327 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.457156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.457238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.457255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.457311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.457329 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: E1205 08:25:02.480861 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:02Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.486700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.486782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.486805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.486830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.486850 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: E1205 08:25:02.512200 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:02Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.519519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.519662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.519691 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.519750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.519772 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: E1205 08:25:02.539663 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:02Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:02 crc kubenswrapper[4795]: E1205 08:25:02.539911 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.542087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.542205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.542235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.542264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.542286 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.644831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.644899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.644912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.644927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.644939 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.746358 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.746475 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:02 crc kubenswrapper[4795]: E1205 08:25:02.746592 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:02 crc kubenswrapper[4795]: E1205 08:25:02.746833 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.748411 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.748499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.748527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.748564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.748591 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.853229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.853295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.853307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.853333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.853352 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.956746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.956813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.956828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.956857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:02 crc kubenswrapper[4795]: I1205 08:25:02.956875 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:02Z","lastTransitionTime":"2025-12-05T08:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.059376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.059464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.059487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.059518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.059544 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.162516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.162570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.162581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.162599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.162640 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.265183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.265232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.265248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.265266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.265278 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.368399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.368434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.368445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.368462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.368474 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.470695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.470743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.470755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.470775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.470787 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.572813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.572855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.572863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.572877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.572887 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.675411 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.675479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.675505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.675537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.675559 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.747083 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.747175 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:03 crc kubenswrapper[4795]: E1205 08:25:03.747332 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:03 crc kubenswrapper[4795]: E1205 08:25:03.747973 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.778250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.778480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.778508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.778541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.778585 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.882126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.882190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.882211 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.882241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.882260 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.984868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.984933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.984952 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.984978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:03 crc kubenswrapper[4795]: I1205 08:25:03.984996 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:03Z","lastTransitionTime":"2025-12-05T08:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.088277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.088331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.088347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.088369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.088386 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.191154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.191220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.191238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.191262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.191282 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.294566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.294671 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.294697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.294724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.294745 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.398033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.398077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.398090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.398106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.398118 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.500842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.500884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.500897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.500914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.500927 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.604582 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.604668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.604679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.604695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.604705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.707485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.707564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.707586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.707652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.707679 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.746974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:04 crc kubenswrapper[4795]: E1205 08:25:04.747487 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.748214 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:04 crc kubenswrapper[4795]: E1205 08:25:04.748591 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.811249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.811333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.811360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.811392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.811416 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.915180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.915251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.915269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.915295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:04 crc kubenswrapper[4795]: I1205 08:25:04.915312 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:04Z","lastTransitionTime":"2025-12-05T08:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.018693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.018767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.018790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.018818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.018844 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.121874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.121945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.121959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.122001 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.122013 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.224858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.224922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.224940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.224965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.224982 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.327553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.327721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.327742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.327766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.327782 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.430464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.430534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.430551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.430579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.430597 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.533308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.533361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.533373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.533393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.533406 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.636740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.636849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.636867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.636892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.636909 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.740171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.740241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.740258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.740282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.740300 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.746524 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.746551 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:05 crc kubenswrapper[4795]: E1205 08:25:05.746732 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:05 crc kubenswrapper[4795]: E1205 08:25:05.746863 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.843479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.843533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.843550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.843574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.843591 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.946258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.946748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.947009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.947235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:05 crc kubenswrapper[4795]: I1205 08:25:05.947427 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:05Z","lastTransitionTime":"2025-12-05T08:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.050959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.051026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.051049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.051075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.051095 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.153675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.153728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.153743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.153764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.153778 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.257386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.257451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.257469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.257494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.257510 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.361381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.361429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.361448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.361476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.361499 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.464773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.464833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.464850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.464876 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.464895 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.568132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.568206 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.568226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.568257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.568277 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.671933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.672300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.672732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.673138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.673353 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.747043 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:06 crc kubenswrapper[4795]: E1205 08:25:06.747813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.747601 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:06 crc kubenswrapper[4795]: E1205 08:25:06.748489 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.777470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.777557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.777578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.777605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.777678 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.880872 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.881297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.881540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.881826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.882068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.987295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.987866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.988120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.988325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:06 crc kubenswrapper[4795]: I1205 08:25:06.988487 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:06Z","lastTransitionTime":"2025-12-05T08:25:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.091686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.092131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.092280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.092445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.092580 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.195686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.195754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.195778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.195808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.195830 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.299354 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.299442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.299460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.299484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.299501 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.403253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.403764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.403999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.404226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.404394 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.508197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.508236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.508247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.508265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.508278 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.611694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.611772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.611797 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.611828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.611849 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.715373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.715870 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.716048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.716211 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.716370 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.746641 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.746665 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:07 crc kubenswrapper[4795]: E1205 08:25:07.747266 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:07 crc kubenswrapper[4795]: E1205 08:25:07.747564 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.747892 4795 scope.go:117] "RemoveContainer" containerID="eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7" Dec 05 08:25:07 crc kubenswrapper[4795]: E1205 08:25:07.748274 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.776378 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e
935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.797976 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.819199 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.821368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.821494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.821515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc 
kubenswrapper[4795]: I1205 08:25:07.821550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.821570 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.838607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.871988 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3
147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.890042 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.910877 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.925946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.926369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.926643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 
08:25:07.926866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.927038 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:07Z","lastTransitionTime":"2025-12-05T08:25:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.933480 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.955752 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.976066 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:07 crc kubenswrapper[4795]: I1205 08:25:07.994994 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:07Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.030594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.030688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.030707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.030736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.030685 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.030754 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.055078 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.077761 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.097506 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.113661 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.127345 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.134266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 
08:25:08.134343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.134367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.134401 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.134432 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.144699 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.237426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.237557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.237584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.237669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.237694 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.340201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.340268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.340292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.340322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.340344 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.453808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.453859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.453871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.453891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.453905 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.557126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.557196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.557208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.557226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.557238 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.660511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.660554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.660565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.660580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.660592 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.746726 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.746741 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:08 crc kubenswrapper[4795]: E1205 08:25:08.747425 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:08 crc kubenswrapper[4795]: E1205 08:25:08.747627 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.763994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.764445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.764567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.764747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.764835 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.767267 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.781332 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.794720 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.811152 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.832604 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3
147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.849442 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.865601 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.868224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.868335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.868403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 
08:25:08.868496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.868649 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.877433 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.888995 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.900146 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.911872 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.930754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.945249 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.957802 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.971798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.971866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.971889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.971919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.971945 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:08Z","lastTransitionTime":"2025-12-05T08:25:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.976403 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:08 crc kubenswrapper[4795]: I1205 08:25:08.994584 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:08Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.011023 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:09Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.025535 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:25:09Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.074716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.074762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.074776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.074798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.074813 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.178940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.178993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.179012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.179036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.179056 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.281151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.281186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.281196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.281211 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.281222 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.386655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.386904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.386932 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.386960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.386981 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.490883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.490950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.490979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.491014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.491038 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.595533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.595581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.595599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.595644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.595662 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.699403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.699503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.699531 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.699562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.699586 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.747232 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.747299 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:09 crc kubenswrapper[4795]: E1205 08:25:09.747429 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:09 crc kubenswrapper[4795]: E1205 08:25:09.747647 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.802699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.802755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.802766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.802785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.802797 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.906015 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.906072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.906085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.906106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:09 crc kubenswrapper[4795]: I1205 08:25:09.906126 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:09Z","lastTransitionTime":"2025-12-05T08:25:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.010240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.010308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.010335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.010367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.010392 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.112564 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.112672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.112693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.112732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.112752 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.216090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.216161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.216181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.216209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.216230 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.319487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.319604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.319868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.319917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.319943 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.422980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.423027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.423039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.423056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.423068 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.529449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.529516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.529535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.529565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.529590 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.632262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.632314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.632331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.632367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.632386 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.734453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.734491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.734502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.734517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.734529 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.749040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:10 crc kubenswrapper[4795]: E1205 08:25:10.749161 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.749349 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:10 crc kubenswrapper[4795]: E1205 08:25:10.749422 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.836434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.836476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.836486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.836500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.836509 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.938862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.938923 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.938935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.938955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:10 crc kubenswrapper[4795]: I1205 08:25:10.938968 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:10Z","lastTransitionTime":"2025-12-05T08:25:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.041843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.041888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.041902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.041928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.041940 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.144770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.144814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.144826 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.144843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.144854 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.247459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.247606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.247670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.247695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.247752 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.351024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.351113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.351134 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.351160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.351179 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.454368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.454420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.454436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.454460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.454477 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.556909 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.557196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.557288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.557366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.557439 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.659787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.659832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.659845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.659863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.659876 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.747171 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.747233 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:11 crc kubenswrapper[4795]: E1205 08:25:11.747578 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:11 crc kubenswrapper[4795]: E1205 08:25:11.747752 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.761631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.761666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.761677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.761693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.761705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.864384 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.864419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.864429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.864442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.864451 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.966117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.966192 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.966202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.966224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:11 crc kubenswrapper[4795]: I1205 08:25:11.966234 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:11Z","lastTransitionTime":"2025-12-05T08:25:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.069784 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.069837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.069847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.069861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.069871 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.172824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.172888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.172904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.172921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.172933 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.275190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.275294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.275317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.275344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.275365 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.377252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.377289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.377300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.377317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.377329 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.479454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.479535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.479549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.479567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.479580 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.567055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.567108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.567122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.567142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.567159 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: E1205 08:25:12.580944 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:12Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.584496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.584661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.584732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.584806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.584875 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: E1205 08:25:12.595523 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:12Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.598605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.598700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.598714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.598732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.598756 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.613118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.613146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.613155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.613187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.613196 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.641883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.641908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.641916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.641934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.641943 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: E1205 08:25:12.654911 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:12Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:12 crc kubenswrapper[4795]: E1205 08:25:12.655050 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.658903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.658945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.658977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.658998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.659010 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.748327 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:12 crc kubenswrapper[4795]: E1205 08:25:12.748851 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.748383 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:12 crc kubenswrapper[4795]: E1205 08:25:12.749206 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.760722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.761029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.761153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.761259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.761361 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.864218 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.864293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.864317 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.864348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.864370 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.966197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.966258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.966271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.966287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:12 crc kubenswrapper[4795]: I1205 08:25:12.966300 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:12Z","lastTransitionTime":"2025-12-05T08:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.068304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.068337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.068351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.068365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.068375 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.171013 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.171041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.171068 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.171081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.171091 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.273136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.273781 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.273996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.274182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.274320 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.376183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.376214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.376221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.376234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.376260 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.478713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.478971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.479073 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.479187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.479306 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.582251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.582825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.582945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.583034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.583142 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.685755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.686082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.686185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.686287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.686386 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.746337 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:13 crc kubenswrapper[4795]: E1205 08:25:13.746522 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.746658 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:13 crc kubenswrapper[4795]: E1205 08:25:13.746876 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.790467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.790506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.790518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.790552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.790566 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.892953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.892997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.893007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.893021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.893034 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.995841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.996295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.996460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.996674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:13 crc kubenswrapper[4795]: I1205 08:25:13.996844 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:13Z","lastTransitionTime":"2025-12-05T08:25:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.100224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.100263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.100272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.100287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.100297 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.202541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.202588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.202597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.202642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.202652 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.305351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.305409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.305420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.305435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.305446 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.407965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.407999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.408007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.408022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.408032 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.510011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.510070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.510078 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.510094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.510104 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.563540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:14 crc kubenswrapper[4795]: E1205 08:25:14.564480 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:25:14 crc kubenswrapper[4795]: E1205 08:25:14.564712 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs podName:6c9f96ec-f615-4030-a78d-2dd56932c6c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:25:46.564676111 +0000 UTC m=+98.137279890 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs") pod "network-metrics-daemon-8cnbm" (UID: "6c9f96ec-f615-4030-a78d-2dd56932c6c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.613264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.613316 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.613329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.613347 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.613361 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.715983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.716297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.716398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.716499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.716590 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.751826 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:14 crc kubenswrapper[4795]: E1205 08:25:14.751978 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.752165 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:14 crc kubenswrapper[4795]: E1205 08:25:14.752245 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.819028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.819061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.819071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.819084 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.819092 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.921280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.921315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.921323 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.921337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:14 crc kubenswrapper[4795]: I1205 08:25:14.921346 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:14Z","lastTransitionTime":"2025-12-05T08:25:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.024185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.024252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.024276 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.024305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.024328 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.128932 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.128966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.128974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.128986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.128995 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.230732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.231101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.231262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.231411 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.231596 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.334717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.334752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.334763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.334779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.334789 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.437532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.437599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.437660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.437690 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.437711 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.540730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.540790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.540801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.540816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.540830 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.644758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.644790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.644798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.644811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.644820 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.747445 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.747507 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:15 crc kubenswrapper[4795]: E1205 08:25:15.747651 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:15 crc kubenswrapper[4795]: E1205 08:25:15.747853 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.748443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.748484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.748502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.748525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.748543 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.851374 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.851421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.851437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.851460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.851478 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.955056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.955124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.955142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.955166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:15 crc kubenswrapper[4795]: I1205 08:25:15.955184 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:15Z","lastTransitionTime":"2025-12-05T08:25:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.057984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.058378 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.058524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.058650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.058818 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.161018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.161087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.161105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.161129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.161147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.263768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.264114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.264264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.264409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.264534 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.295708 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/0.log" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.295811 4795 generic.go:334] "Generic (PLEG): container finished" podID="9dd42ab7-1f98-4f50-ae12-15ec6587bc4e" containerID="1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3" exitCode=1 Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.295905 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhxnf" event={"ID":"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e","Type":"ContainerDied","Data":"1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.296608 4795 scope.go:117] "RemoveContainer" containerID="1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.316485 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.333464 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.346744 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.359982 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.367686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 
08:25:16.367745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.367768 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.367795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.367813 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.373872 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.389284 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.405592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.417461 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.445934 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3
147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.459969 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.470391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.470422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.470431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.470444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.470453 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.476425 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.489672 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.501434 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:15Z\\\",\\\"message\\\":\\\"2025-12-05T08:24:30+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3\\\\n2025-12-05T08:24:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3 to /host/opt/cni/bin/\\\\n2025-12-05T08:24:30Z [verbose] multus-daemon started\\\\n2025-12-05T08:24:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T08:25:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.511685 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.521992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.533034 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:2
4:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf8
3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.548398 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.568504 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf
5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:16Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.572118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.572143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.572151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.572164 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.572173 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.675275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.675322 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.675340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.675363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.675380 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.746926 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.747069 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:16 crc kubenswrapper[4795]: E1205 08:25:16.747176 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:16 crc kubenswrapper[4795]: E1205 08:25:16.747502 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.777221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.777267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.777283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.777304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.777321 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.880172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.880226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.880243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.880268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.880285 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.983277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.983555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.983718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.983849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:16 crc kubenswrapper[4795]: I1205 08:25:16.983944 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:16Z","lastTransitionTime":"2025-12-05T08:25:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.087148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.087545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.087775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.087910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.088040 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.190165 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.190213 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.190230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.190254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.190271 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.293099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.293149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.293165 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.293188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.293206 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.301998 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/0.log" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.302097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhxnf" event={"ID":"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e","Type":"ContainerStarted","Data":"927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.314060 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.342367 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.395857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.396122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.396204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 
08:25:17.396311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.396386 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.396881 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.413054 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.425147 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.436731 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.447577 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.462378 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.483385 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3
147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.493655 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.498644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.498759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.498824 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.498892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.498965 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.507095 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.520143 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.532690 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:15Z\\\",\\\"message\\\":\\\"2025-12-05T08:24:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3\\\\n2025-12-05T08:24:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3 to /host/opt/cni/bin/\\\\n2025-12-05T08:24:30Z [verbose] multus-daemon started\\\\n2025-12-05T08:24:30Z [verbose] 
Readiness Indicator file check\\\\n2025-12-05T08:25:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:25:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.543155 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38af
e2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.552722 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6
feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.564317 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.577919 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.595735 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf
5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:17Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.600977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.601019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.601028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.601043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.601054 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.704156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.704196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.704205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.704220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.704228 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.746506 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:17 crc kubenswrapper[4795]: E1205 08:25:17.746728 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.746506 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:17 crc kubenswrapper[4795]: E1205 08:25:17.747102 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.806963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.807016 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.807033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.807055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.807072 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.908902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.908970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.908987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.909014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:17 crc kubenswrapper[4795]: I1205 08:25:17.909030 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:17Z","lastTransitionTime":"2025-12-05T08:25:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.011693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.011946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.012029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.012112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.012192 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.114891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.115135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.115202 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.115273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.115330 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.218147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.218419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.218562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.218722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.218836 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.321289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.321565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.322001 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.322090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.322183 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.425171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.425219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.425231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.425248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.425261 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.528259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.528309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.528324 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.528346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.528359 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.631758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.631829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.631850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.631883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.631901 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.734157 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.734220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.734240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.734275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.734297 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.746651 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.746679 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:18 crc kubenswrapper[4795]: E1205 08:25:18.746766 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:18 crc kubenswrapper[4795]: E1205 08:25:18.746880 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.761851 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.772097 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T
08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.789964 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3
147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.800876 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.815246 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4
f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.831585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.835746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.835777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.835785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.835798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.835807 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.845203 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:15Z\\\",\\\"message\\\":\\\"2025-12-05T08:24:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3\\\\n2025-12-05T08:24:30+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3 to /host/opt/cni/bin/\\\\n2025-12-05T08:24:30Z [verbose] multus-daemon started\\\\n2025-12-05T08:24:30Z [verbose] Readiness Indicator file check\\\\n2025-12-05T08:25:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:25:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.854421 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.864138 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.875521 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:2
4:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf8
3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.888149 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.904840 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.918437 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1
104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.932101 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.940566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.940595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:18 crc 
kubenswrapper[4795]: I1205 08:25:18.940603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.940635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.940644 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:18Z","lastTransitionTime":"2025-12-05T08:25:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.951972 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.964081 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.975545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:18 crc kubenswrapper[4795]: I1205 08:25:18.988822 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:18Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.050509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.050920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.051025 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.051156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.051256 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.153038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.153071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.153081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.153095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.153105 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.255473 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.255524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.255533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.255552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.255563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.358557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.358601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.358629 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.358646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.358656 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.460962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.461028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.461040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.461062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.461073 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.563469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.563528 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.563541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.563562 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.563581 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.667063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.667116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.667127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.667149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.667161 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.746548 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.746594 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:19 crc kubenswrapper[4795]: E1205 08:25:19.746839 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:19 crc kubenswrapper[4795]: E1205 08:25:19.747010 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.770018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.770063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.770072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.770090 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.770099 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.872353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.872396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.872408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.872427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.872440 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.974403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.974445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.974455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.974496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:19 crc kubenswrapper[4795]: I1205 08:25:19.974561 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:19Z","lastTransitionTime":"2025-12-05T08:25:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.077603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.077659 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.077670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.077690 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.077705 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.180719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.180772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.180788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.180809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.180822 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.283985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.284108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.284127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.284153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.284171 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.387199 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.387265 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.387279 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.387305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.387319 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.490801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.490863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.490877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.490907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.490920 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.593633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.593674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.593686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.593701 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.593714 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.696754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.696825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.696848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.696875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.696895 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.746823 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:20 crc kubenswrapper[4795]: E1205 08:25:20.746976 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.746827 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:20 crc kubenswrapper[4795]: E1205 08:25:20.747188 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.798948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.799009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.799019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.799041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.799053 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.901891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.901936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.901953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.901979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:20 crc kubenswrapper[4795]: I1205 08:25:20.901997 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:20Z","lastTransitionTime":"2025-12-05T08:25:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.006216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.006287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.006307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.006333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.006351 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.108825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.108869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.108879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.108898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.108910 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.212065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.212132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.212151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.212555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.212608 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.315260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.315305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.315320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.315339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.315352 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.418071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.418141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.418160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.418186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.418204 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.521387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.521445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.521463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.521486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.521502 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.623823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.623865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.623876 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.623895 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.623907 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.727436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.727503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.727523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.727543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.727555 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.747061 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.747150 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:21 crc kubenswrapper[4795]: E1205 08:25:21.747232 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:21 crc kubenswrapper[4795]: E1205 08:25:21.747296 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.748852 4795 scope.go:117] "RemoveContainer" containerID="eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.829838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.829928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.829949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.829983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.830008 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.932856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.932904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.932919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.932938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:21 crc kubenswrapper[4795]: I1205 08:25:21.932953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:21Z","lastTransitionTime":"2025-12-05T08:25:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.035928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.035965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.035973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.035987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.035997 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.138510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.138560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.138575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.138593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.138607 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.241699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.241753 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.241766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.241785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.241797 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.319133 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/2.log" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.321917 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.322332 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.344288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.344324 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.344331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.344344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.344353 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.347983 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.369262 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.390306 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.404783 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.427224 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.440760 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.452798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 
08:25:22.452853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.452866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.452887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.452901 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.460801 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.476259 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 
08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.488826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.505150 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.516755 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.535086 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.548196 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc 
kubenswrapper[4795]: I1205 08:25:22.554556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.554578 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.554586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.554599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.554607 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.564236 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d96443
11900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.578975 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.593586 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:15Z\\\",\\\"message\\\":\\\"2025-12-05T08:24:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3\\\\n2025-12-05T08:24:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3 to /host/opt/cni/bin/\\\\n2025-12-05T08:24:30Z [verbose] multus-daemon started\\\\n2025-12-05T08:24:30Z [verbose] 
Readiness Indicator file check\\\\n2025-12-05T08:25:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:25:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.606314 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38af
e2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.621317 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6
feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.656712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.656750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.656762 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.656778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.656790 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.747136 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.747248 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:22 crc kubenswrapper[4795]: E1205 08:25:22.747307 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:22 crc kubenswrapper[4795]: E1205 08:25:22.747473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.758816 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.758851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.758861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.758903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.758917 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.859779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.859829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.859844 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.859863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.859880 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: E1205 08:25:22.877952 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.884166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.884216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.884234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.884260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.884277 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: E1205 08:25:22.908322 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.914284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.914333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.914363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.914387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.914409 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: E1205 08:25:22.933792 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.938213 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.938301 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.938320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.938345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.938363 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: E1205 08:25:22.953346 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.958605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.958684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.958703 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.958726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.958743 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:22 crc kubenswrapper[4795]: E1205 08:25:22.977587 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:22 crc kubenswrapper[4795]: E1205 08:25:22.977918 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.979697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.979756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.979774 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.979800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:22 crc kubenswrapper[4795]: I1205 08:25:22.979820 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:22Z","lastTransitionTime":"2025-12-05T08:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.083046 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.083124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.083148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.083175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.083193 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.187003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.187066 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.187083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.187108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.187126 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.289899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.289961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.289978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.290003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.290021 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.393960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.394017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.394029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.394049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.394066 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.498083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.498131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.498148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.498171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.498189 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.601205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.601281 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.601299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.601325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.601342 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.704468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.704537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.704559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.704588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.704650 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.746702 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.746879 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:23 crc kubenswrapper[4795]: E1205 08:25:23.747081 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:23 crc kubenswrapper[4795]: E1205 08:25:23.747234 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.808522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.808583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.808602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.808675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.808696 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.919918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.919983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.920004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.920028 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:23 crc kubenswrapper[4795]: I1205 08:25:23.920047 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:23Z","lastTransitionTime":"2025-12-05T08:25:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.024156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.024224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.024243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.024269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.024287 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.130779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.130850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.130868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.130893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.130911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.234769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.235323 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.235526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.235695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.235835 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.332190 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/3.log" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.333358 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/2.log" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.338156 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" exitCode=1 Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.338297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.338509 4795 scope.go:117] "RemoveContainer" containerID="eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.339496 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:25:24 crc kubenswrapper[4795]: E1205 08:25:24.339925 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.340770 4795 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.340897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.340993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.341759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.341795 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.363991 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d96443
11900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.388726 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.407953 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:15Z\\\",\\\"message\\\":\\\"2025-12-05T08:24:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3\\\\n2025-12-05T08:24:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3 to /host/opt/cni/bin/\\\\n2025-12-05T08:24:30Z [verbose] multus-daemon started\\\\n2025-12-05T08:24:30Z [verbose] 
Readiness Indicator file check\\\\n2025-12-05T08:25:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:25:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.425141 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38af
e2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.445305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.445345 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.445358 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.445374 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.445386 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.446924 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.471324 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.489280 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a
0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.507545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.524795 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.545582 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.549360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.549424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.549441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.549470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.549487 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.567122 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.584749 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.601554 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 
08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.616228 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.630514 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.643152 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.657539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.657566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.657575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.657587 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.657596 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.664903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:23Z\\\",\\\"message\\\":\\\"ble to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z]\\\\nI1205 08:25:22.657706 6703 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1205 08:25:22.657716 6703 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1205 08:25:22.657724 6703 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1205 08:25:22.657727 6703 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1205 08:25:22.657744 6703 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/ipta\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\
\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 
08:25:24.680956 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:24Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:24 crc 
kubenswrapper[4795]: I1205 08:25:24.747171 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:24 crc kubenswrapper[4795]: E1205 08:25:24.747378 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.747477 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:24 crc kubenswrapper[4795]: E1205 08:25:24.747731 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.760398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.760466 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.760488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.760519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.760537 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.864668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.864748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.864769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.864803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.864829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.969383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.969466 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.969488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.969518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:24 crc kubenswrapper[4795]: I1205 08:25:24.969533 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:24Z","lastTransitionTime":"2025-12-05T08:25:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.072639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.072699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.072711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.072731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.072744 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.179000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.179062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.179085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.179130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.179148 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.282569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.282685 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.282706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.282782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.282854 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.347568 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/3.log" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.385379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.385447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.385464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.385490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.385514 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.489385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.489474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.489493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.489520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.489538 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.600005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.600064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.600081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.600103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.600119 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.702399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.702435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.702447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.702465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.702475 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.746659 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:25 crc kubenswrapper[4795]: E1205 08:25:25.746789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.746655 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:25 crc kubenswrapper[4795]: E1205 08:25:25.747099 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.805485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.805525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.805536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.805551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.805563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.908603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.909172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.909373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.909573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:25 crc kubenswrapper[4795]: I1205 08:25:25.909815 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:25Z","lastTransitionTime":"2025-12-05T08:25:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.013389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.013460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.013471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.013489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.013499 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.116832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.117366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.117591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.117822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.117945 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.220911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.220978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.220999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.221026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.221047 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.324762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.324831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.324849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.324873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.324894 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.427775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.427837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.427857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.428023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.428047 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.532072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.532133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.532150 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.532175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.532195 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.635809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.635901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.635917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.635947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.635967 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.739929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.740323 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.740469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.740699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.740845 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.746456 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.746639 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:26 crc kubenswrapper[4795]: E1205 08:25:26.746701 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:26 crc kubenswrapper[4795]: E1205 08:25:26.746868 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.844580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.845806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.846030 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.846208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.846372 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.949149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.949459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.949662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.949771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:26 crc kubenswrapper[4795]: I1205 08:25:26.949867 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:26Z","lastTransitionTime":"2025-12-05T08:25:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.052436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.052495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.052512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.052536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.052554 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.155589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.155695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.155717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.155746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.155769 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.259225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.259290 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.259306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.259329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.259345 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.361788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.361853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.361872 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.361899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.361918 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.466080 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.466148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.466167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.466232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.466251 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.568866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.569433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.569540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.569682 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.569817 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.673965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.674376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.674503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.674634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.674732 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.746944 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:27 crc kubenswrapper[4795]: E1205 08:25:27.747119 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.747510 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:27 crc kubenswrapper[4795]: E1205 08:25:27.747801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.779573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.779697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.779726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.779763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.779790 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.883627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.884015 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.884031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.884050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.884065 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.986683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.986932 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.987026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.987114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:27 crc kubenswrapper[4795]: I1205 08:25:27.987180 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:27Z","lastTransitionTime":"2025-12-05T08:25:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.090446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.090500 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.090522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.090546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.090563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.194245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.194320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.194341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.194369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.194390 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.298194 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.298263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.298283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.298309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.298327 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.401065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.401120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.401137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.401161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.401178 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.504548 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.504649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.504675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.504706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.504731 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.607524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.607588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.607610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.607679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.607700 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.711212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.711346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.711374 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.711403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.711425 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.747175 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.747180 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:28 crc kubenswrapper[4795]: E1205 08:25:28.747383 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:28 crc kubenswrapper[4795]: E1205 08:25:28.747568 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.766126 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc 
kubenswrapper[4795]: I1205 08:25:28.790172 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a
07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 
08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.811110 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.815765 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.815831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.815856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.815889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.815912 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.833091 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.851528 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.884188 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca87fcc77104e97b9d8117d07a4218e199d8baf2a420d0576be4562cbf40ab7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:24:55Z\\\",\\\"message\\\":\\\"AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:9001, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.43\\\\\\\", Port:8798, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, 
Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1205 08:24:55.594682 6369 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1205 08:24:55.594690 6369 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1205 08:24:55.594701 6369 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1205 08:24:55.594732 6369 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}\\\\nI1205 08:24:55.594756 6369 services_controller.go:454] Service openshift-service-ca-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1205 08:24:55.594775 6369 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:23Z\\\",\\\"message\\\":\\\"ble to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z]\\\\nI1205 08:25:22.657706 6703 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1205 08:25:22.657716 6703 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1205 08:25:22.657724 6703 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1205 08:25:22.657727 6703 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1205 08:25:22.657744 6703 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/ipta\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:25:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\
\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\
"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 
08:25:28.907562 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.918359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.918438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.918457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.918483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.918500 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:28Z","lastTransitionTime":"2025-12-05T08:25:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.928071 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.944857 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:15Z\\\",\\\"message\\\":\\\"2025-12-05T08:24:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3\\\\n2025-12-05T08:24:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3 to /host/opt/cni/bin/\\\\n2025-12-05T08:24:30Z [verbose] multus-daemon started\\\\n2025-12-05T08:24:30Z [verbose] 
Readiness Indicator file check\\\\n2025-12-05T08:25:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:25:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.959971 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38af
e2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:28 crc kubenswrapper[4795]: I1205 08:25:28.976962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6
feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:28Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.012760 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.021531 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.021568 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.021579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.021596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.021629 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.037278 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.059718 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.082431 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.102153 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.124959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.125017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.125036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.125062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.125080 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.125548 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.148787 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:29Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.228498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.228558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.228575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.228603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.228692 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.331058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.331115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.331132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.331156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.331174 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.433970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.434056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.434083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.434114 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.434138 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.538098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.538182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.538207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.538237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.538260 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.641544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.641592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.641639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.641664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.641681 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.744431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.744480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.744493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.744510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.744522 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.746412 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:29 crc kubenswrapper[4795]: E1205 08:25:29.746547 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.746756 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:29 crc kubenswrapper[4795]: E1205 08:25:29.746828 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.847310 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.847399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.847426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.847462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.847488 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.952598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.952714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.952733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.952762 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:29 crc kubenswrapper[4795]: I1205 08:25:29.952780 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:29Z","lastTransitionTime":"2025-12-05T08:25:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.056393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.056467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.056491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.056520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.056542 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.159196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.159264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.159283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.159306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.159323 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.263086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.263128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.263146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.263168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.263185 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.365974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.366062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.366098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.366134 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.366158 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.469893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.469957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.469969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.469996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.470009 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.574882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.574953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.574965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.574992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.575005 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.678234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.678286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.678299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.678319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.678332 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.747012 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.747111 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:30 crc kubenswrapper[4795]: E1205 08:25:30.747232 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:30 crc kubenswrapper[4795]: E1205 08:25:30.747439 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.780842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.780890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.780901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.780919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.780932 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.884049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.884158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.884175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.884197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.884211 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.987153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.987220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.987237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.987260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:30 crc kubenswrapper[4795]: I1205 08:25:30.987273 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:30Z","lastTransitionTime":"2025-12-05T08:25:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.090698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.090785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.090801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.090829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.090848 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.194465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.194529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.194547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.194571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.194589 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.297988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.298058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.298075 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.298102 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.298121 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.401112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.401187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.401207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.401235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.401254 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.504873 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.504942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.504961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.504988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.505008 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.607511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.607555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.607566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.607581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.607593 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.700693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.700907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.700960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701017 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.700958754 +0000 UTC m=+147.273562593 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701114 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701135 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.701133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701151 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.701255 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701338 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.701315834 +0000 UTC m=+147.273919583 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701164 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701186 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701526 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.701501109 +0000 UTC m=+147.274104878 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701539 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701564 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.701542631 +0000 UTC m=+147.274146360 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701585 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701652 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.701727 4795 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.701703775 +0000 UTC m=+147.274307744 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.710788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.710840 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.710857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.710882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.710900 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.746748 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.746769 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.747082 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:31 crc kubenswrapper[4795]: E1205 08:25:31.747248 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.814072 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.814143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.814171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.814209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.814230 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.916786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.916834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.916847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.916867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:31 crc kubenswrapper[4795]: I1205 08:25:31.916880 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:31Z","lastTransitionTime":"2025-12-05T08:25:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.020521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.020599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.020651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.020686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.020710 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.123601 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.123700 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.123722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.123757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.123780 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.228208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.228287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.228306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.228334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.228353 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.332334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.332410 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.332429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.332455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.332475 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.443917 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.444019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.444047 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.444082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.444105 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.547512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.547596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.547679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.547719 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.547747 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.651097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.651431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.651530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.651653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.651751 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.747164 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.747238 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:32 crc kubenswrapper[4795]: E1205 08:25:32.747432 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:32 crc kubenswrapper[4795]: E1205 08:25:32.747851 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.754594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.754657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.754670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.754685 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.754698 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.857035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.857098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.857160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.857196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.857219 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.960221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.960306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.960328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.960356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:32 crc kubenswrapper[4795]: I1205 08:25:32.960379 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:32Z","lastTransitionTime":"2025-12-05T08:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.063733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.063814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.063834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.063861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.063881 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.167092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.167167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.167186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.167215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.167238 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.270023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.270060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.270071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.270085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.270095 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.343813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.343893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.343921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.343952 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.343977 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: E1205 08:25:33.361205 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.366441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.366499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.366511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.366533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.366551 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: E1205 08:25:33.382638 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.386792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.386866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.386881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.387180 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.387221 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.411804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.411888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.411916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.411967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.411993 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: E1205 08:25:33.428704 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.434458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.434531 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.434546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.434577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.434597 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: E1205 08:25:33.453323 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148072Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608872Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0308ff11-c3b4-4b0c-9b5f-7f47ac118fdb\\\",\\\"systemUUID\\\":\\\"57d745a0-49a1-4146-a982-16c31b0a2ce8\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:33Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:33 crc kubenswrapper[4795]: E1205 08:25:33.453501 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.455911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.455980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.455995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.456014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.456028 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.559433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.559539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.559559 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.559586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.559606 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.662082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.662119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.662130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.662147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.662160 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.747234 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.747320 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:33 crc kubenswrapper[4795]: E1205 08:25:33.747451 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:33 crc kubenswrapper[4795]: E1205 08:25:33.747796 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.765001 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.765041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.765051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.765068 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.765080 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.868286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.868357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.868373 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.868399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.868419 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.971668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.971740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.971759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.971787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:33 crc kubenswrapper[4795]: I1205 08:25:33.971811 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:33Z","lastTransitionTime":"2025-12-05T08:25:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.076879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.076967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.076989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.077019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.077042 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.181091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.181657 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.181680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.181709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.181727 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.284681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.284796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.284818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.284845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.284864 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.388813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.388890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.388916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.388944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.388968 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.495767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.495828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.495850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.495887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.495906 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.599819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.599916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.599937 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.599964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.599982 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.703670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.703718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.703730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.703748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.703760 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.747490 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.747584 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:34 crc kubenswrapper[4795]: E1205 08:25:34.747943 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:34 crc kubenswrapper[4795]: E1205 08:25:34.748080 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.806376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.806418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.806428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.806443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.806454 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.908828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.908882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.908894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.908912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:34 crc kubenswrapper[4795]: I1205 08:25:34.908924 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:34Z","lastTransitionTime":"2025-12-05T08:25:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.011437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.011494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.011504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.011517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.011528 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.114792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.114863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.114874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.114912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.114933 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.217977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.218061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.218087 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.218120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.218147 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.321547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.321642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.321665 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.321693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.321711 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.424879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.424953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.424976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.425005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.425030 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.528707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.529118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.529296 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.529466 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.529710 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.633341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.633384 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.633396 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.633412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.633424 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.736302 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.736369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.736391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.736423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.736446 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.746679 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.746916 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:35 crc kubenswrapper[4795]: E1205 08:25:35.747737 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:35 crc kubenswrapper[4795]: E1205 08:25:35.747931 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.748504 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:25:35 crc kubenswrapper[4795]: E1205 08:25:35.748794 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.764799 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.781736 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07841d71-c3fd-4311-8a61-758c5b7e645a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://093016e29f5b054c08f674808ccf04ba29248afa9f680d7283914816f7a88dee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76cf11ef4541ce292c74155a3404cf8a57ac63793bf4242d7a6066c06df5579c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6633ec92c763795e606f0d536e55b7d027b670acce182dff71eb93bf0f95777\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c39da4b6b04b20daf5296cc4e73cd29fb5014140c2e5aa3d4979497664b7f476\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41751bba2a08d8053ca28c9a328befb386eec55e63e2b980975e9b1bf16496b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ebeccd524637c60f084ba0c8f76c9fc7dde0b3cb856596f1e6b1a8ffbcc81fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13009b298ccee686c27e02fa8948f560be4e5408a024c2a4933b40cbdf0034b3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3260112996ec2fe357d5efda726fae4fa091104a2f67b8df40b08fbe95e41505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.805602 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rns2q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3273f819-71fb-4fdc-8869-dc3b787f4592\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43635742e06ebf022c18bdc65ec9ae1bba34b8c6cff5a3d8ce3583ffaab903e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb4
13bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b166a1c434b7e5e7518873e4b1ab4eacccd64160b0a8c938e52fad1f8b3e5b36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb701b58eb072a047ff4c6303b64172e2698b61c286cd2e3fa3885fa5fc16a9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4f2ce74a29b6d4ee1f959130959eda94805839f48f23c0118954a3e66c94922\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f9a0be85566fcbbe84709dd95b539097e1a53acae2bf2480fae8da42e3c426\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70b206fc1b97d86bcbe6c03ae52c8abb1cf31a4b2c4fd97328478ba630a50cba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cab0ef60a26458b5adfe1
104bb61ac400ca9663d4f6a7cb0ba89b3273509d7c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z4q2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rns2q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.824876 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbb9b2fbdbd0026408ac5b545b7bdffe7fcc5aa023639986364af51e86419988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69614b7af8bc92e83834fb6eb583984904d0167ca6e2ef56ef304922e407ee8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.839651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.839714 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.839732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.839758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.839777 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.845488 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.864577 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b7c3d7c557a6d16000fd7717a878ee80bc0c51d388fcc2a284f6cb1010af80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.884964 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://633192e084d65843a1f293149acd6b7c2975b16e4a28d3055a01d929784d4bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.904541 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.920511 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zmscs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c390ffe7-ac55-487a-aabd-6e0a3245c6d8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf9d4ef258d177fe60e393f508c7a887bedf133712a5c49e35e68b9e256a25a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqr87\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zmscs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.943796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.943880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.943901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.943935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.943954 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:35Z","lastTransitionTime":"2025-12-05T08:25:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.953847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cfece70d-6476-4442-bcc6-8ee82a8330c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:23Z\\\",\\\"message\\\":\\\"ble to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: 
failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:22Z is after 2025-08-24T17:21:41Z]\\\\nI1205 08:25:22.657706 6703 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1205 08:25:22.657716 6703 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1205 08:25:22.657724 6703 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1205 08:25:22.657727 6703 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1205 08:25:22.657744 6703 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/ipta\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:25:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0639c9f0beb6c5db3
147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-78k86\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xl8v5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.972820 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c9f96ec-f615-4030-a78d-2dd56932c6c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rwhlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8cnbm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:35 crc kubenswrapper[4795]: I1205 08:25:35.993971 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"42252db7-6e43-427a-9257-3516071eb545\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1205 08:24:21.668823 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1205 08:24:21.670558 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-976009832/tls.crt::/tmp/serving-cert-976009832/tls.key\\\\\\\"\\\\nI1205 08:24:27.408856 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1205 08:24:27.410861 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1205 08:24:27.410882 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1205 08:24:27.410901 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1205 08:24:27.410907 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1205 08:24:27.426549 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1205 08:24:27.426575 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426580 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1205 08:24:27.426594 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1205 08:24:27.426598 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1205 08:24:27.426602 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1205 08:24:27.426605 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1205 08:24:27.426782 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1205 08:24:27.436244 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4
f0318fb67b941f2f44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:35Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.013992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba3535cb-53c2-44f7-9f71-b966398912db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71e376b1c35ef3c73fa38de6109b5d40a040ed36f47a7cc45f82ffd041d7d26d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f021c64b890d49a3e6ad3136c7d55bba865f17b87240a59fef526e7bff78692a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00310fa4458fc230470ad0c038ced24ca22c269a014151596b60c80237cc4d28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://30c41374c7d4dfaa1cca6251c61298943ac9447bf76b933f434cd6536c553fb9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-05T08:24:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:36Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.035906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"23494e8d-0824-46a2-9b0c-c447f1d5e5d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1496eb9b4c85d2ac0702a04927675b2c5e13784f5170d4d00215c88e4af51aad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a10
6fcf7a34679330c2a256c8ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7cxmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t68zt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:36Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.047224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.047508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.047693 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc 
kubenswrapper[4795]: I1205 08:25:36.047904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.048053 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.056654 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-nw8pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b8a3269-c30f-4b78-b300-dbb66ed703b8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a2e5462408ef38afe2136d5d899811a75558333a719bf70ab87a96f8b3943c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ppq2n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-nw8pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:36Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.077347 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"993c8d73-2e31-4128-95a9-db06e34b8de1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e439a428481c1ef705cbba6f3b23b4bbf9afca2dcae4232c74470f793dc4dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31c9330cf28288ae10a158fcaec2985a992f6
feef999578ce4329a613b33c4d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rwt9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xm244\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:36Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.098079 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4c0f8a9-ece1-4501-8f54-4411b04e959e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6411a6ae0c385bf8d4b69bcfa67723ba6296c2f115eb4089950ef221453deb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf96a96c646cfcfc3c2680bee0cbdc579efd4d9a75001d8872288d7c4439b867\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ab5150624661c935ef9b66156ed7d529b4ec4d38aac8f5470fd6ce9a62fdf83\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-05T08:24:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:08Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:36Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.118470 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:36Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.140243 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bhxnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:24:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-05T08:25:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-05T08:25:15Z\\\",\\\"message\\\":\\\"2025-12-05T08:24:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3\\\\n2025-12-05T08:24:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8b20d17-f002-4a3b-aa6f-f62a05e4c2b3 to /host/opt/cni/bin/\\\\n2025-12-05T08:24:30Z [verbose] multus-daemon started\\\\n2025-12-05T08:24:30Z [verbose] 
Readiness Indicator file check\\\\n2025-12-05T08:25:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-05T08:24:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-05T08:25:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95jqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-05T08:24:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bhxnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-05T08:25:36Z is after 2025-08-24T17:21:41Z" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.151143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.151184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.151200 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.151224 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.151242 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.254980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.255048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.255061 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.255085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.255100 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.358167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.358232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.358242 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.358264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.358281 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.461526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.461583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.461596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.461637 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.461652 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.564448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.564512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.564521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.564537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.564546 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.667825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.667900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.667925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.667954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.668001 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.747229 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:36 crc kubenswrapper[4795]: E1205 08:25:36.747462 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.747788 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:36 crc kubenswrapper[4795]: E1205 08:25:36.747899 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.771369 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.771429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.771449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.771472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.771491 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.875111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.875203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.875235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.875275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.875303 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.978975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.979085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.979104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.979129 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:36 crc kubenswrapper[4795]: I1205 08:25:36.979146 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:36Z","lastTransitionTime":"2025-12-05T08:25:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.081927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.081994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.082008 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.082024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.082039 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.185337 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.185391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.185405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.185428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.185442 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.289303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.289387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.289407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.289433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.289450 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.392508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.392681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.392711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.392744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.392769 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.495934 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.496007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.496019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.496068 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.496085 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.599605 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.599732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.599755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.599783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.599806 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.702678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.703053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.703208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.703361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.703504 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.746995 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.747062 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 08:25:37 crc kubenswrapper[4795]: E1205 08:25:37.747238 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 08:25:37 crc kubenswrapper[4795]: E1205 08:25:37.747412 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.807329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.807697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.807802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.807893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.807993 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.910452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.910803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.910898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.910992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:37 crc kubenswrapper[4795]: I1205 08:25:37.911077 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:37Z","lastTransitionTime":"2025-12-05T08:25:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.022925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.023256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.023349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.023450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.023536 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.126463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.126535 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.126557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.126589 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.126647 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.229747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.229793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.229805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.229821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.229834 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.332740 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.332780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.332790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.332806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.332818 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.435813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.435900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.435918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.435946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.435971 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.539716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.539766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.539786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.539809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.539825 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.672878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.672948 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.672961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.672985 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.673007 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.746386 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.746386 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 05 08:25:38 crc kubenswrapper[4795]: E1205 08:25:38.747545 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1"
Dec 05 08:25:38 crc kubenswrapper[4795]: E1205 08:25:38.747971 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.772716 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nw8pr" podStartSLOduration=71.772694799 podStartE2EDuration="1m11.772694799s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:38.770544617 +0000 UTC m=+90.343148356" watchObservedRunningTime="2025-12-05 08:25:38.772694799 +0000 UTC m=+90.345298538"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.777021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.777160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.777189 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.777227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.777265 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.792492 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xm244" podStartSLOduration=70.792463721 podStartE2EDuration="1m10.792463721s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:38.792367119 +0000 UTC m=+90.364970898" watchObservedRunningTime="2025-12-05 08:25:38.792463721 +0000 UTC m=+90.365067490"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.820360 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.820337888 podStartE2EDuration="1m10.820337888s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:38.820312308 +0000 UTC m=+90.392916087" watchObservedRunningTime="2025-12-05 08:25:38.820337888 +0000 UTC m=+90.392941637"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.879575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.879633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.879674 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.879689 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.879699 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.910505 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bhxnf" podStartSLOduration=71.910481688 podStartE2EDuration="1m11.910481688s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:38.870901242 +0000 UTC m=+90.443504981" watchObservedRunningTime="2025-12-05 08:25:38.910481688 +0000 UTC m=+90.483085447"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.911327 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.911316652 podStartE2EDuration="1m11.911316652s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:38.909605043 +0000 UTC m=+90.482208802" watchObservedRunningTime="2025-12-05 08:25:38.911316652 +0000 UTC m=+90.483920411"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.967533 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rns2q" podStartSLOduration=71.9675136 podStartE2EDuration="1m11.9675136s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:38.934271977 +0000 UTC m=+90.506875736" watchObservedRunningTime="2025-12-05 08:25:38.9675136 +0000 UTC m=+90.540117339"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.983303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.983334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.983355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.983378 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:38 crc kubenswrapper[4795]: I1205 08:25:38.983391 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:38Z","lastTransitionTime":"2025-12-05T08:25:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.033835 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.033817129 podStartE2EDuration="4.033817129s" podCreationTimestamp="2025-12-05 08:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:39.01000303 +0000 UTC m=+90.582606769" watchObservedRunningTime="2025-12-05 08:25:39.033817129 +0000 UTC m=+90.606420878"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.085171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.085208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.085217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.085230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.085239 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.126878 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zmscs" podStartSLOduration=72.126862572 podStartE2EDuration="1m12.126862572s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:39.095828814 +0000 UTC m=+90.668432553" watchObservedRunningTime="2025-12-05 08:25:39.126862572 +0000 UTC m=+90.699466301"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.174255 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.174236374 podStartE2EDuration="45.174236374s" podCreationTimestamp="2025-12-05 08:24:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:39.174139421 +0000 UTC m=+90.746743160" watchObservedRunningTime="2025-12-05 08:25:39.174236374 +0000 UTC m=+90.746840123"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.174712 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.174707767 podStartE2EDuration="1m12.174707767s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:39.161015722 +0000 UTC m=+90.733619491" watchObservedRunningTime="2025-12-05 08:25:39.174707767 +0000 UTC m=+90.747311506"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.187255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.187443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.187341 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podStartSLOduration=72.187325013 podStartE2EDuration="1m12.187325013s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:39.185791988 +0000 UTC m=+90.758395727" watchObservedRunningTime="2025-12-05 08:25:39.187325013 +0000 UTC m=+90.759928752"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.187527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.187756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.187772 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.290913 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.290950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.290960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.290974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.290984 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.394420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.394484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.394499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.394522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.394537 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.497161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.497204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.497215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.497230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.497240 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.600071 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.600124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.600135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.600152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.600167 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.703020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.703077 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.703095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.703119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.703138 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.746905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.746990 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 05 08:25:39 crc kubenswrapper[4795]: E1205 08:25:39.747070 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 05 08:25:39 crc kubenswrapper[4795]: E1205 08:25:39.747144 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.807216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.807295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.807313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.807343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.807362 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.911184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.911252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.911270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.911294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 05 08:25:39 crc kubenswrapper[4795]: I1205 08:25:39.911313 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:39Z","lastTransitionTime":"2025-12-05T08:25:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.014666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.014789 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.014854 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.014888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.014912 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.117944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.117989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.118012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.118034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.118047 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.221002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.221064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.221081 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.221104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.221122 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.324442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.324563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.324584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.324609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.324654 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.427940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.428007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.428029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.428055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.428076 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.531258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.531344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.531372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.531404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.531426 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.635448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.635524 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.635543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.635570 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.635588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.738539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.738672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.738705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.738731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.738750 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.746912 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.747093 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:40 crc kubenswrapper[4795]: E1205 08:25:40.747275 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:40 crc kubenswrapper[4795]: E1205 08:25:40.747474 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.842048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.842098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.842113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.842135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.842149 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.944916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.944980 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.944997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.945020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:40 crc kubenswrapper[4795]: I1205 08:25:40.945038 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:40Z","lastTransitionTime":"2025-12-05T08:25:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.047803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.047878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.047902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.047931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.047955 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.150975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.151010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.151019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.151031 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.151041 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.253591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.253684 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.253702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.253724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.253739 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.357640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.357695 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.357708 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.357726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.357738 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.460966 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.461023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.461040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.461065 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.461084 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.563262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.563346 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.563372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.563402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.563424 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.665886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.665941 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.665953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.665970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.665982 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.746928 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.746941 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:41 crc kubenswrapper[4795]: E1205 08:25:41.747068 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:41 crc kubenswrapper[4795]: E1205 08:25:41.747185 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.768341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.768381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.768389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.768402 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.768412 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.870678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.870722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.870734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.870759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.870772 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.975488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.975530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.975541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.975557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:41 crc kubenswrapper[4795]: I1205 08:25:41.975569 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:41Z","lastTransitionTime":"2025-12-05T08:25:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.078509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.078698 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.078716 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.078736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.078748 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.181724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.181759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.181767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.181782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.181792 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.285485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.285521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.285529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.285544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.285553 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.390549 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.390656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.390679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.390708 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.390729 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.492995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.493054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.493067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.493085 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.493101 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.595480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.595540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.595557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.595581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.595602 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.698519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.698899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.699036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.699183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.699366 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.746544 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.746598 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:42 crc kubenswrapper[4795]: E1205 08:25:42.746741 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:42 crc kubenswrapper[4795]: E1205 08:25:42.746849 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.802982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.803049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.803067 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.803092 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.803111 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.907195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.907263 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.907286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.907314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:42 crc kubenswrapper[4795]: I1205 08:25:42.907340 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:42Z","lastTransitionTime":"2025-12-05T08:25:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.009634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.009680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.009690 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.009706 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.009716 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.111865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.111915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.111929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.111946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.111959 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.216070 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.216138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.216153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.216174 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.216191 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.319106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.319191 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.319216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.319247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.319274 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.422664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.422747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.422773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.422808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.422829 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.525977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.526024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.526040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.526069 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.526086 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.629872 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.629940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.629960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.629988 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.630010 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.732959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.733012 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.733024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.733041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.733052 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.746649 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:43 crc kubenswrapper[4795]: E1205 08:25:43.746787 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.746656 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:43 crc kubenswrapper[4795]: E1205 08:25:43.746991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.804505 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.804686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.804708 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.804731 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.804748 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-05T08:25:43Z","lastTransitionTime":"2025-12-05T08:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.867440 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79"] Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.867922 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.871412 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.873167 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.873941 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.876309 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.961343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc39ada7-5695-484b-9144-f285c10d7fb4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.961764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bc39ada7-5695-484b-9144-f285c10d7fb4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: 
\"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.962009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc39ada7-5695-484b-9144-f285c10d7fb4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.962249 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc39ada7-5695-484b-9144-f285c10d7fb4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:43 crc kubenswrapper[4795]: I1205 08:25:43.962505 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bc39ada7-5695-484b-9144-f285c10d7fb4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.063140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc39ada7-5695-484b-9144-f285c10d7fb4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.063202 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bc39ada7-5695-484b-9144-f285c10d7fb4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.063228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc39ada7-5695-484b-9144-f285c10d7fb4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.063251 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc39ada7-5695-484b-9144-f285c10d7fb4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.063281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bc39ada7-5695-484b-9144-f285c10d7fb4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.063338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bc39ada7-5695-484b-9144-f285c10d7fb4-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.063878 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bc39ada7-5695-484b-9144-f285c10d7fb4-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.065454 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc39ada7-5695-484b-9144-f285c10d7fb4-service-ca\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.072050 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc39ada7-5695-484b-9144-f285c10d7fb4-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.080499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc39ada7-5695-484b-9144-f285c10d7fb4-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-f8h79\" (UID: \"bc39ada7-5695-484b-9144-f285c10d7fb4\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.196844 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" Dec 05 08:25:44 crc kubenswrapper[4795]: W1205 08:25:44.223235 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc39ada7_5695_484b_9144_f285c10d7fb4.slice/crio-d239bf3115081ef606af7a7f371e2306e40206a19796b22fdd28c5183c77656a WatchSource:0}: Error finding container d239bf3115081ef606af7a7f371e2306e40206a19796b22fdd28c5183c77656a: Status 404 returned error can't find the container with id d239bf3115081ef606af7a7f371e2306e40206a19796b22fdd28c5183c77656a Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.431918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" event={"ID":"bc39ada7-5695-484b-9144-f285c10d7fb4","Type":"ContainerStarted","Data":"d239bf3115081ef606af7a7f371e2306e40206a19796b22fdd28c5183c77656a"} Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.746569 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:44 crc kubenswrapper[4795]: I1205 08:25:44.746664 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:44 crc kubenswrapper[4795]: E1205 08:25:44.747034 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:44 crc kubenswrapper[4795]: E1205 08:25:44.747244 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:45 crc kubenswrapper[4795]: I1205 08:25:45.439001 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" event={"ID":"bc39ada7-5695-484b-9144-f285c10d7fb4","Type":"ContainerStarted","Data":"eee5c6900285870b152bb78b30bc885355505dc7e2e6107338324326a113733d"} Dec 05 08:25:45 crc kubenswrapper[4795]: I1205 08:25:45.746598 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:45 crc kubenswrapper[4795]: I1205 08:25:45.746704 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:45 crc kubenswrapper[4795]: E1205 08:25:45.747392 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:45 crc kubenswrapper[4795]: E1205 08:25:45.747394 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:46 crc kubenswrapper[4795]: I1205 08:25:46.596791 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:46 crc kubenswrapper[4795]: E1205 08:25:46.597027 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:25:46 crc kubenswrapper[4795]: E1205 08:25:46.597148 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs podName:6c9f96ec-f615-4030-a78d-2dd56932c6c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:26:50.597115186 +0000 UTC m=+162.169718955 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs") pod "network-metrics-daemon-8cnbm" (UID: "6c9f96ec-f615-4030-a78d-2dd56932c6c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 05 08:25:46 crc kubenswrapper[4795]: I1205 08:25:46.746580 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:46 crc kubenswrapper[4795]: I1205 08:25:46.746707 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:46 crc kubenswrapper[4795]: E1205 08:25:46.746786 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:46 crc kubenswrapper[4795]: E1205 08:25:46.746875 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:47 crc kubenswrapper[4795]: I1205 08:25:47.747399 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:47 crc kubenswrapper[4795]: I1205 08:25:47.747579 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:47 crc kubenswrapper[4795]: E1205 08:25:47.747696 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:47 crc kubenswrapper[4795]: E1205 08:25:47.747866 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:48 crc kubenswrapper[4795]: I1205 08:25:48.747155 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:48 crc kubenswrapper[4795]: E1205 08:25:48.748373 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:48 crc kubenswrapper[4795]: I1205 08:25:48.748567 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:48 crc kubenswrapper[4795]: E1205 08:25:48.749222 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:48 crc kubenswrapper[4795]: I1205 08:25:48.749415 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:25:48 crc kubenswrapper[4795]: E1205 08:25:48.749997 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xl8v5_openshift-ovn-kubernetes(cfece70d-6476-4442-bcc6-8ee82a8330c1)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" Dec 05 08:25:49 crc kubenswrapper[4795]: I1205 08:25:49.747153 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:49 crc kubenswrapper[4795]: I1205 08:25:49.747251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:49 crc kubenswrapper[4795]: E1205 08:25:49.747373 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:49 crc kubenswrapper[4795]: E1205 08:25:49.748045 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:50 crc kubenswrapper[4795]: I1205 08:25:50.747460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:50 crc kubenswrapper[4795]: I1205 08:25:50.747493 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:50 crc kubenswrapper[4795]: E1205 08:25:50.747764 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:50 crc kubenswrapper[4795]: E1205 08:25:50.747912 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:51 crc kubenswrapper[4795]: I1205 08:25:51.746996 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:51 crc kubenswrapper[4795]: I1205 08:25:51.747009 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:51 crc kubenswrapper[4795]: E1205 08:25:51.747171 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:51 crc kubenswrapper[4795]: E1205 08:25:51.747313 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:52 crc kubenswrapper[4795]: I1205 08:25:52.746552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:52 crc kubenswrapper[4795]: I1205 08:25:52.746552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:52 crc kubenswrapper[4795]: E1205 08:25:52.746789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:52 crc kubenswrapper[4795]: E1205 08:25:52.746906 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:53 crc kubenswrapper[4795]: I1205 08:25:53.747330 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:53 crc kubenswrapper[4795]: E1205 08:25:53.747523 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:53 crc kubenswrapper[4795]: I1205 08:25:53.747329 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:53 crc kubenswrapper[4795]: E1205 08:25:53.747874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:54 crc kubenswrapper[4795]: I1205 08:25:54.746874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:54 crc kubenswrapper[4795]: I1205 08:25:54.746978 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:54 crc kubenswrapper[4795]: E1205 08:25:54.747104 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:54 crc kubenswrapper[4795]: E1205 08:25:54.747213 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:55 crc kubenswrapper[4795]: I1205 08:25:55.747093 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:55 crc kubenswrapper[4795]: I1205 08:25:55.747309 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:55 crc kubenswrapper[4795]: E1205 08:25:55.747481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:55 crc kubenswrapper[4795]: E1205 08:25:55.747751 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:56 crc kubenswrapper[4795]: I1205 08:25:56.746736 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:56 crc kubenswrapper[4795]: I1205 08:25:56.746743 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:56 crc kubenswrapper[4795]: E1205 08:25:56.747224 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:56 crc kubenswrapper[4795]: E1205 08:25:56.747016 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:57 crc kubenswrapper[4795]: I1205 08:25:57.746480 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:57 crc kubenswrapper[4795]: I1205 08:25:57.746655 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:57 crc kubenswrapper[4795]: E1205 08:25:57.746917 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:25:57 crc kubenswrapper[4795]: E1205 08:25:57.746707 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:58 crc kubenswrapper[4795]: I1205 08:25:58.747287 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:25:58 crc kubenswrapper[4795]: I1205 08:25:58.747332 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:25:58 crc kubenswrapper[4795]: E1205 08:25:58.749590 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:25:58 crc kubenswrapper[4795]: E1205 08:25:58.750070 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:25:59 crc kubenswrapper[4795]: I1205 08:25:59.747053 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:25:59 crc kubenswrapper[4795]: I1205 08:25:59.747134 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:25:59 crc kubenswrapper[4795]: E1205 08:25:59.748211 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:25:59 crc kubenswrapper[4795]: E1205 08:25:59.748274 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:00 crc kubenswrapper[4795]: I1205 08:26:00.747007 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:00 crc kubenswrapper[4795]: I1205 08:26:00.747121 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:00 crc kubenswrapper[4795]: E1205 08:26:00.747225 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:00 crc kubenswrapper[4795]: E1205 08:26:00.747349 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:01 crc kubenswrapper[4795]: I1205 08:26:01.747175 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:01 crc kubenswrapper[4795]: E1205 08:26:01.747338 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:01 crc kubenswrapper[4795]: I1205 08:26:01.748267 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:01 crc kubenswrapper[4795]: E1205 08:26:01.748563 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.504383 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/1.log" Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.505149 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/0.log" Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.505250 4795 generic.go:334] "Generic (PLEG): container finished" podID="9dd42ab7-1f98-4f50-ae12-15ec6587bc4e" containerID="927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46" exitCode=1 Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.505337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhxnf" event={"ID":"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e","Type":"ContainerDied","Data":"927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46"} Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.505433 4795 scope.go:117] "RemoveContainer" containerID="1bc0e8ac3386ac24e3e77ce822e12b2b8570cbd32a413ae40e280cd218eeb1a3" Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.505970 4795 scope.go:117] "RemoveContainer" containerID="927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46" Dec 05 08:26:02 crc kubenswrapper[4795]: E1205 08:26:02.506242 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bhxnf_openshift-multus(9dd42ab7-1f98-4f50-ae12-15ec6587bc4e)\"" pod="openshift-multus/multus-bhxnf" podUID="9dd42ab7-1f98-4f50-ae12-15ec6587bc4e" Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.532190 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-f8h79" podStartSLOduration=95.532164912 podStartE2EDuration="1m35.532164912s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:25:45.462065445 +0000 UTC m=+97.034669224" watchObservedRunningTime="2025-12-05 08:26:02.532164912 +0000 UTC m=+114.104768651" Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.747851 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:02 crc kubenswrapper[4795]: I1205 08:26:02.747866 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:02 crc kubenswrapper[4795]: E1205 08:26:02.750028 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:02 crc kubenswrapper[4795]: E1205 08:26:02.750238 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:03 crc kubenswrapper[4795]: I1205 08:26:03.510124 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/1.log" Dec 05 08:26:03 crc kubenswrapper[4795]: I1205 08:26:03.765329 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:03 crc kubenswrapper[4795]: E1205 08:26:03.765530 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:03 crc kubenswrapper[4795]: I1205 08:26:03.766279 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:03 crc kubenswrapper[4795]: E1205 08:26:03.766330 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:03 crc kubenswrapper[4795]: I1205 08:26:03.766945 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:26:04 crc kubenswrapper[4795]: I1205 08:26:04.515184 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/3.log" Dec 05 08:26:04 crc kubenswrapper[4795]: I1205 08:26:04.518806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerStarted","Data":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} Dec 05 08:26:04 crc kubenswrapper[4795]: I1205 08:26:04.519302 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:26:04 crc kubenswrapper[4795]: I1205 08:26:04.560996 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podStartSLOduration=97.560976109 podStartE2EDuration="1m37.560976109s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:04.560257848 +0000 UTC m=+116.132861597" watchObservedRunningTime="2025-12-05 08:26:04.560976109 +0000 UTC m=+116.133579848" Dec 05 08:26:04 crc kubenswrapper[4795]: I1205 08:26:04.749254 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:04 crc kubenswrapper[4795]: E1205 08:26:04.749413 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:04 crc kubenswrapper[4795]: I1205 08:26:04.749664 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:04 crc kubenswrapper[4795]: E1205 08:26:04.749743 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:04 crc kubenswrapper[4795]: I1205 08:26:04.782211 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8cnbm"] Dec 05 08:26:05 crc kubenswrapper[4795]: I1205 08:26:05.522808 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:05 crc kubenswrapper[4795]: E1205 08:26:05.523025 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:05 crc kubenswrapper[4795]: I1205 08:26:05.746330 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:05 crc kubenswrapper[4795]: I1205 08:26:05.746459 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:05 crc kubenswrapper[4795]: E1205 08:26:05.746528 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:05 crc kubenswrapper[4795]: E1205 08:26:05.746737 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:06 crc kubenswrapper[4795]: I1205 08:26:06.746963 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:06 crc kubenswrapper[4795]: E1205 08:26:06.747171 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:07 crc kubenswrapper[4795]: I1205 08:26:07.747066 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:07 crc kubenswrapper[4795]: I1205 08:26:07.747211 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:07 crc kubenswrapper[4795]: I1205 08:26:07.747359 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:07 crc kubenswrapper[4795]: E1205 08:26:07.748059 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:07 crc kubenswrapper[4795]: E1205 08:26:07.748106 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:07 crc kubenswrapper[4795]: E1205 08:26:07.748172 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:08 crc kubenswrapper[4795]: E1205 08:26:08.746061 4795 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 05 08:26:08 crc kubenswrapper[4795]: I1205 08:26:08.746412 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:08 crc kubenswrapper[4795]: E1205 08:26:08.748533 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:08 crc kubenswrapper[4795]: E1205 08:26:08.831895 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 08:26:09 crc kubenswrapper[4795]: I1205 08:26:09.747016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:09 crc kubenswrapper[4795]: I1205 08:26:09.747032 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:09 crc kubenswrapper[4795]: I1205 08:26:09.747059 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:09 crc kubenswrapper[4795]: E1205 08:26:09.747352 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:09 crc kubenswrapper[4795]: E1205 08:26:09.747467 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:09 crc kubenswrapper[4795]: E1205 08:26:09.747700 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:10 crc kubenswrapper[4795]: I1205 08:26:10.746925 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:10 crc kubenswrapper[4795]: E1205 08:26:10.747214 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:11 crc kubenswrapper[4795]: I1205 08:26:11.746911 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:11 crc kubenswrapper[4795]: I1205 08:26:11.746981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:11 crc kubenswrapper[4795]: E1205 08:26:11.747119 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:11 crc kubenswrapper[4795]: I1205 08:26:11.746916 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:11 crc kubenswrapper[4795]: E1205 08:26:11.747199 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:11 crc kubenswrapper[4795]: E1205 08:26:11.747333 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:12 crc kubenswrapper[4795]: I1205 08:26:12.747184 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:12 crc kubenswrapper[4795]: E1205 08:26:12.747378 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:13 crc kubenswrapper[4795]: I1205 08:26:13.746863 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:13 crc kubenswrapper[4795]: I1205 08:26:13.746949 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:13 crc kubenswrapper[4795]: E1205 08:26:13.747094 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:13 crc kubenswrapper[4795]: E1205 08:26:13.747171 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:13 crc kubenswrapper[4795]: I1205 08:26:13.747726 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:13 crc kubenswrapper[4795]: E1205 08:26:13.747845 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:13 crc kubenswrapper[4795]: E1205 08:26:13.834441 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 05 08:26:14 crc kubenswrapper[4795]: I1205 08:26:14.746709 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:14 crc kubenswrapper[4795]: I1205 08:26:14.747346 4795 scope.go:117] "RemoveContainer" containerID="927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46" Dec 05 08:26:14 crc kubenswrapper[4795]: E1205 08:26:14.747591 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:15 crc kubenswrapper[4795]: I1205 08:26:15.567799 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/1.log" Dec 05 08:26:15 crc kubenswrapper[4795]: I1205 08:26:15.568266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhxnf" event={"ID":"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e","Type":"ContainerStarted","Data":"863f2c68098a900a1d311fea196bad6095f72bce77173f4e437e7e983cb49f39"} Dec 05 08:26:15 crc kubenswrapper[4795]: I1205 08:26:15.746761 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:15 crc kubenswrapper[4795]: I1205 08:26:15.746824 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:15 crc kubenswrapper[4795]: I1205 08:26:15.746782 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:15 crc kubenswrapper[4795]: E1205 08:26:15.746926 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:15 crc kubenswrapper[4795]: E1205 08:26:15.746999 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:15 crc kubenswrapper[4795]: E1205 08:26:15.747075 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:16 crc kubenswrapper[4795]: I1205 08:26:16.746685 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:16 crc kubenswrapper[4795]: E1205 08:26:16.746874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:17 crc kubenswrapper[4795]: I1205 08:26:17.746578 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:17 crc kubenswrapper[4795]: I1205 08:26:17.746593 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:17 crc kubenswrapper[4795]: E1205 08:26:17.746822 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8cnbm" podUID="6c9f96ec-f615-4030-a78d-2dd56932c6c1" Dec 05 08:26:17 crc kubenswrapper[4795]: I1205 08:26:17.747067 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:17 crc kubenswrapper[4795]: E1205 08:26:17.747171 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 05 08:26:17 crc kubenswrapper[4795]: E1205 08:26:17.747332 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 05 08:26:18 crc kubenswrapper[4795]: I1205 08:26:18.747010 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:18 crc kubenswrapper[4795]: E1205 08:26:18.748475 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 05 08:26:19 crc kubenswrapper[4795]: I1205 08:26:19.747016 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:19 crc kubenswrapper[4795]: I1205 08:26:19.747059 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:19 crc kubenswrapper[4795]: I1205 08:26:19.747235 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:19 crc kubenswrapper[4795]: I1205 08:26:19.750236 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 08:26:19 crc kubenswrapper[4795]: I1205 08:26:19.750533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 08:26:19 crc kubenswrapper[4795]: I1205 08:26:19.750535 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 08:26:19 crc kubenswrapper[4795]: I1205 08:26:19.751825 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 08:26:20 crc kubenswrapper[4795]: I1205 08:26:20.747045 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:20 crc kubenswrapper[4795]: I1205 08:26:20.752754 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 08:26:20 crc kubenswrapper[4795]: I1205 08:26:20.752990 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 08:26:24 crc kubenswrapper[4795]: I1205 08:26:24.659694 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.143247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.187055 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ng26g"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.187739 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.193627 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.193639 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.194528 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.195320 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.195365 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.195396 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.195480 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.197404 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.199011 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x6pjq"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.199597 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.203262 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.221510 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.223319 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.229068 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.230105 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.239574 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-r8zdl"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.239928 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8qz8"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.240345 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.240969 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.241123 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.231800 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.235763 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.241070 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tn798"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.242714 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.243055 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.243445 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.234388 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.234430 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.239772 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.239820 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.241044 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.242048 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.244957 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.245000 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.246474 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.247142 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qsf8w"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.250097 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.251154 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.252823 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.253059 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.253084 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.253177 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.257258 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.257839 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d7l5q"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 
08:26:25.258148 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.258164 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vksgm"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.258533 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.258754 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.261658 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hpn6h"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.262152 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.268223 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.269581 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tgggp"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.270106 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.270497 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.270540 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.271065 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.273532 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.273763 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.273888 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.274099 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.274330 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.275137 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.275675 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.276257 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.276546 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xcg4r"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.276842 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.277011 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.277475 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.278200 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.279092 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.279188 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.279306 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.279673 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.279999 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.280227 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.280292 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.280490 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.280647 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.287679 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 08:26:25 crc 
kubenswrapper[4795]: I1205 08:26:25.287834 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zpcq7"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.287955 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.289130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.294133 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.294413 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.294512 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.305677 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.310417 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cwk8c"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.310706 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.311027 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.311313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.311437 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.315268 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.343103 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.364790 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.364881 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.378237 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.378464 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.378713 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 
08:26:25.378845 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379042 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379230 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379293 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28c9743-ac3d-478a-8b4d-92510027278f-serving-cert\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmfd\" (UniqueName: \"kubernetes.io/projected/44511bda-0717-4c08-adf2-7dd984e85120-kube-api-access-6wmfd\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379393 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d28c9743-ac3d-478a-8b4d-92510027278f-trusted-ca\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc 
kubenswrapper[4795]: I1205 08:26:25.379413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f2cafa-6fac-4139-b57d-94fb44307bb1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2f2cafa-6fac-4139-b57d-94fb44307bb1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ffe336e-9a69-4b3e-81c7-34bf5333858f-node-pullsecrets\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379477 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlzg\" (UniqueName: \"kubernetes.io/projected/9c19621b-c574-4047-8586-75272bf2fbcc-kube-api-access-vjlzg\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-config\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-config-volume\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-serving-cert\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379556 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379578 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7l56\" (UniqueName: \"kubernetes.io/projected/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-kube-api-access-p7l56\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379606 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.383809 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379688 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379707 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379763 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394018 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.392314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28c9743-ac3d-478a-8b4d-92510027278f-config\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ab895138-2fff-4449-b071-d4ad7b35ff07-metrics-tls\") pod \"dns-operator-744455d44c-tgggp\" (UID: \"ab895138-2fff-4449-b071-d4ad7b35ff07\") " pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-config\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-encryption-config\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-stats-auth\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d33db5-212f-4884-b78b-159f06592142-serving-cert\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29k7k\" (UniqueName: \"kubernetes.io/projected/62d33db5-212f-4884-b78b-159f06592142-kube-api-access-29k7k\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394597 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-client-ca\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394669 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-etcd-serving-ca\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v2rx\" (UniqueName: \"kubernetes.io/projected/19c362f4-26e6-4cd3-84dc-648d240524d3-kube-api-access-6v2rx\") pod \"migrator-59844c95c7-ps5dv\" (UID: \"19c362f4-26e6-4cd3-84dc-648d240524d3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394721 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-oauth-serving-cert\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394741 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b02abb39-40e1-4e8b-9d51-7c775f083f92-serving-cert\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdclf\" (UniqueName: 
\"kubernetes.io/projected/c951e8a1-6a1b-44d9-9d36-0516636d679c-kube-api-access-wdclf\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394787 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccdhp\" (UniqueName: \"kubernetes.io/projected/ab895138-2fff-4449-b071-d4ad7b35ff07-kube-api-access-ccdhp\") pod \"dns-operator-744455d44c-tgggp\" (UID: \"ab895138-2fff-4449-b071-d4ad7b35ff07\") " pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-encryption-config\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxdsw\" (UniqueName: \"kubernetes.io/projected/67c6f735-c0f7-4539-a2d4-0785b4238435-kube-api-access-rxdsw\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29rg5\" (UniqueName: \"kubernetes.io/projected/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-kube-api-access-29rg5\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc 
kubenswrapper[4795]: I1205 08:26:25.394880 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-config\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394905 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-serving-cert\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-service-ca\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-config\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.394998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395048 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdbf5b12-68d6-4da4-90a8-48e275995388-trusted-ca\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-ca\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395090 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-default-certificate\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dz9x\" (UniqueName: \"kubernetes.io/projected/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-kube-api-access-7dz9x\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e371a7d-d8ef-4440-a940-af49a6a2d364-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395170 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-secret-volume\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395202 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-image-import-ca\") pod 
\"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395222 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4ffe336e-9a69-4b3e-81c7-34bf5333858f-audit-dir\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frdzs\" (UniqueName: \"kubernetes.io/projected/41bb386f-8261-4203-a385-f2918e5f9718-kube-api-access-frdzs\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdbf5b12-68d6-4da4-90a8-48e275995388-metrics-tls\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44511bda-0717-4c08-adf2-7dd984e85120-service-ca-bundle\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c19621b-c574-4047-8586-75272bf2fbcc-machine-approver-tls\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-trusted-ca-bundle\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-config\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-audit-policies\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395474 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrv2k\" (UniqueName: \"kubernetes.io/projected/a5628405-485f-42a4-ba10-db97a6df14b5-kube-api-access-wrv2k\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-audit\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdpsv\" (UniqueName: \"kubernetes.io/projected/2e371a7d-d8ef-4440-a940-af49a6a2d364-kube-api-access-jdpsv\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3e829d4-5649-4dcf-a646-1f7873175d2e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395558 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-oauth-config\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c19621b-c574-4047-8586-75272bf2fbcc-config\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395597 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-serving-cert\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395633 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41bb386f-8261-4203-a385-f2918e5f9718-audit-dir\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rcxj\" (UniqueName: \"kubernetes.io/projected/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-kube-api-access-7rcxj\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395876 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-metrics-certs\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c19621b-c574-4047-8586-75272bf2fbcc-auth-proxy-config\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.395953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396033 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-console-config\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396057 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfrfw\" (UniqueName: \"kubernetes.io/projected/f2f2cafa-6fac-4139-b57d-94fb44307bb1-kube-api-access-rfrfw\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396099 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396121 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-client\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396167 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-client-ca\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396188 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-etcd-client\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396211 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-serving-cert\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvzrg\" (UniqueName: \"kubernetes.io/projected/4ffe336e-9a69-4b3e-81c7-34bf5333858f-kube-api-access-zvzrg\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e829d4-5649-4dcf-a646-1f7873175d2e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396327 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396376 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396399 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4079335a-cdd7-48c7-8c64-7493bda89ed9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ckn4t\" (UID: \"4079335a-cdd7-48c7-8c64-7493bda89ed9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-serving-cert\") pod \"console-f9d7485db-r8zdl\" (UID: 
\"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e371a7d-d8ef-4440-a940-af49a6a2d364-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-service-ca\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-etcd-client\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsdk9\" (UniqueName: \"kubernetes.io/projected/245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d-kube-api-access-jsdk9\") pod \"downloads-7954f5f757-qsf8w\" (UID: \"245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d\") " pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396680 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdbf5b12-68d6-4da4-90a8-48e275995388-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396702 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgzc\" (UniqueName: \"kubernetes.io/projected/bdbf5b12-68d6-4da4-90a8-48e275995388-kube-api-access-ddgzc\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396733 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.396753 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5628405-485f-42a4-ba10-db97a6df14b5-serving-cert\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.397102 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j54hb"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379839 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379898 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.398183 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x6pjq"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.398275 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379944 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.379986 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.398496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ng26g"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380024 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380056 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380153 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380197 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.398753 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.398859 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-audit-policies\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: 
I1205 08:26:25.398898 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v78bs\" (UniqueName: \"kubernetes.io/projected/d28c9743-ac3d-478a-8b4d-92510027278f-kube-api-access-v78bs\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.398925 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e829d4-5649-4dcf-a646-1f7873175d2e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399031 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: \"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380267 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/daf348e0-0463-4007-8696-5c1b1483348b-signing-key\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 
08:26:25.399092 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r77d\" (UniqueName: \"kubernetes.io/projected/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-kube-api-access-5r77d\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399126 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: \"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399144 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380433 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399176 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/daf348e0-0463-4007-8696-5c1b1483348b-signing-cabundle\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c951e8a1-6a1b-44d9-9d36-0516636d679c-audit-dir\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380470 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399250 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwwzb\" (UniqueName: \"kubernetes.io/projected/4079335a-cdd7-48c7-8c64-7493bda89ed9-kube-api-access-wwwzb\") pod \"package-server-manager-789f6589d5-ckn4t\" (UID: \"4079335a-cdd7-48c7-8c64-7493bda89ed9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380509 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 08:26:25 crc 
kubenswrapper[4795]: I1205 08:26:25.399314 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-config\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380515 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-images\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399375 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8b25\" (UniqueName: \"kubernetes.io/projected/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-kube-api-access-n8b25\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: \"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380546 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-66pmj\" (UniqueName: \"kubernetes.io/projected/b02abb39-40e1-4e8b-9d51-7c775f083f92-kube-api-access-66pmj\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380582 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.399443 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptzlf\" (UniqueName: \"kubernetes.io/projected/daf348e0-0463-4007-8696-5c1b1483348b-kube-api-access-ptzlf\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380634 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.381001 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.381214 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.381247 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.381279 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.381365 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.381426 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.381886 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.382150 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.382307 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.382346 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.382783 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.383152 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.383432 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.386994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.387019 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.392229 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.380238 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400397 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400479 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400534 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400561 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400605 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400756 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400832 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400869 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400912 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.400497 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.401294 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.401987 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.403644 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.404683 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.405093 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.406120 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.407309 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.408286 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-r8zdl"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.408396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.430918 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.433794 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.434081 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.435748 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.441231 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.441405 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.442082 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.442694 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.462044 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.462377 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.462951 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.463466 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.463625 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.463792 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.463639 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.465286 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47r47"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.466068 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.466321 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.466652 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.470412 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.477711 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4bg7c"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.478168 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d7l5q"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.478185 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.478274 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4bg7c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.478384 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.480591 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.482720 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.483685 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.485537 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.485706 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.487281 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zpcq7"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.489565 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.489683 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qsf8w"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.491749 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm"] 
Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.491854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8qz8"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.492110 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-audit-policies\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500274 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v78bs\" (UniqueName: \"kubernetes.io/projected/d28c9743-ac3d-478a-8b4d-92510027278f-kube-api-access-v78bs\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500304 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e829d4-5649-4dcf-a646-1f7873175d2e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500337 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: 
\"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500358 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/daf348e0-0463-4007-8696-5c1b1483348b-signing-key\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r77d\" (UniqueName: \"kubernetes.io/projected/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-kube-api-access-5r77d\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500399 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500439 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/daf348e0-0463-4007-8696-5c1b1483348b-signing-cabundle\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c951e8a1-6a1b-44d9-9d36-0516636d679c-audit-dir\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwwzb\" (UniqueName: \"kubernetes.io/projected/4079335a-cdd7-48c7-8c64-7493bda89ed9-kube-api-access-wwwzb\") pod \"package-server-manager-789f6589d5-ckn4t\" (UID: \"4079335a-cdd7-48c7-8c64-7493bda89ed9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: \"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500509 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bfv9k"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.501342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-images\") pod 
\"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.500525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-images\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.501917 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-audit-policies\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8b25\" (UniqueName: \"kubernetes.io/projected/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-kube-api-access-n8b25\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: \"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502101 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66pmj\" (UniqueName: \"kubernetes.io/projected/b02abb39-40e1-4e8b-9d51-7c775f083f92-kube-api-access-66pmj\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502337 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-config\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502451 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptzlf\" (UniqueName: \"kubernetes.io/projected/daf348e0-0463-4007-8696-5c1b1483348b-kube-api-access-ptzlf\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28c9743-ac3d-478a-8b4d-92510027278f-serving-cert\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502753 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmfd\" (UniqueName: \"kubernetes.io/projected/44511bda-0717-4c08-adf2-7dd984e85120-kube-api-access-6wmfd\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502819 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f2cafa-6fac-4139-b57d-94fb44307bb1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2f2cafa-6fac-4139-b57d-94fb44307bb1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ffe336e-9a69-4b3e-81c7-34bf5333858f-node-pullsecrets\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlzg\" (UniqueName: \"kubernetes.io/projected/9c19621b-c574-4047-8586-75272bf2fbcc-kube-api-access-vjlzg\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d28c9743-ac3d-478a-8b4d-92510027278f-trusted-ca\") pod \"console-operator-58897d9998-d7l5q\" (UID: 
\"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-config\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503026 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-config-volume\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-serving-cert\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503066 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503088 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7l56\" (UniqueName: \"kubernetes.io/projected/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-kube-api-access-p7l56\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ab895138-2fff-4449-b071-d4ad7b35ff07-metrics-tls\") pod \"dns-operator-744455d44c-tgggp\" (UID: \"ab895138-2fff-4449-b071-d4ad7b35ff07\") " pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-config\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503178 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-encryption-config\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503197 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28c9743-ac3d-478a-8b4d-92510027278f-config\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503215 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d33db5-212f-4884-b78b-159f06592142-serving-cert\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503236 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29k7k\" (UniqueName: \"kubernetes.io/projected/62d33db5-212f-4884-b78b-159f06592142-kube-api-access-29k7k\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-client-ca\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503301 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-stats-auth\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-etcd-serving-ca\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v2rx\" (UniqueName: \"kubernetes.io/projected/19c362f4-26e6-4cd3-84dc-648d240524d3-kube-api-access-6v2rx\") pod \"migrator-59844c95c7-ps5dv\" (UID: \"19c362f4-26e6-4cd3-84dc-648d240524d3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b02abb39-40e1-4e8b-9d51-7c775f083f92-serving-cert\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502525 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-service-ca-bundle\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: 
\"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdclf\" (UniqueName: \"kubernetes.io/projected/c951e8a1-6a1b-44d9-9d36-0516636d679c-kube-api-access-wdclf\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503433 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccdhp\" (UniqueName: \"kubernetes.io/projected/ab895138-2fff-4449-b071-d4ad7b35ff07-kube-api-access-ccdhp\") pod \"dns-operator-744455d44c-tgggp\" (UID: \"ab895138-2fff-4449-b071-d4ad7b35ff07\") " pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-config\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503459 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-oauth-serving-cert\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxdsw\" (UniqueName: \"kubernetes.io/projected/67c6f735-c0f7-4539-a2d4-0785b4238435-kube-api-access-rxdsw\") pod 
\"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503501 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29rg5\" (UniqueName: \"kubernetes.io/projected/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-kube-api-access-29rg5\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-config\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503553 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-serving-cert\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-encryption-config\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-service-ca\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503663 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-config\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503724 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 
08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503746 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-ca\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-default-certificate\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dz9x\" (UniqueName: \"kubernetes.io/projected/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-kube-api-access-7dz9x\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e371a7d-d8ef-4440-a940-af49a6a2d364-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdbf5b12-68d6-4da4-90a8-48e275995388-trusted-ca\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: 
\"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.503865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-secret-volume\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-image-import-ca\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4ffe336e-9a69-4b3e-81c7-34bf5333858f-audit-dir\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdbf5b12-68d6-4da4-90a8-48e275995388-metrics-tls\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504107 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44511bda-0717-4c08-adf2-7dd984e85120-service-ca-bundle\") pod 
\"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frdzs\" (UniqueName: \"kubernetes.io/projected/41bb386f-8261-4203-a385-f2918e5f9718-kube-api-access-frdzs\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c19621b-c574-4047-8586-75272bf2fbcc-machine-approver-tls\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-trusted-ca-bundle\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504190 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-config\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504249 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrv2k\" (UniqueName: \"kubernetes.io/projected/a5628405-485f-42a4-ba10-db97a6df14b5-kube-api-access-wrv2k\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-audit\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdpsv\" (UniqueName: \"kubernetes.io/projected/2e371a7d-d8ef-4440-a940-af49a6a2d364-kube-api-access-jdpsv\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504307 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-audit-policies\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504326 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3e829d4-5649-4dcf-a646-1f7873175d2e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-oauth-config\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-serving-cert\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41bb386f-8261-4203-a385-f2918e5f9718-audit-dir\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c19621b-c574-4047-8586-75272bf2fbcc-config\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 
08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504428 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-metrics-certs\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rcxj\" (UniqueName: \"kubernetes.io/projected/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-kube-api-access-7rcxj\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504498 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c19621b-c574-4047-8586-75272bf2fbcc-auth-proxy-config\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-console-config\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504888 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfrfw\" (UniqueName: \"kubernetes.io/projected/f2f2cafa-6fac-4139-b57d-94fb44307bb1-kube-api-access-rfrfw\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504945 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.504983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-client-ca\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505044 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-etcd-client\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505063 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-client\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505102 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-serving-cert\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505122 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvzrg\" (UniqueName: \"kubernetes.io/projected/4ffe336e-9a69-4b3e-81c7-34bf5333858f-kube-api-access-zvzrg\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505138 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e829d4-5649-4dcf-a646-1f7873175d2e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505155 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505173 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505190 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4079335a-cdd7-48c7-8c64-7493bda89ed9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ckn4t\" (UID: \"4079335a-cdd7-48c7-8c64-7493bda89ed9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505209 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-serving-cert\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505265 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 
08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e371a7d-d8ef-4440-a940-af49a6a2d364-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505332 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-service-ca\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-etcd-client\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505415 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505435 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsdk9\" (UniqueName: \"kubernetes.io/projected/245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d-kube-api-access-jsdk9\") pod \"downloads-7954f5f757-qsf8w\" (UID: \"245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d\") " pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdbf5b12-68d6-4da4-90a8-48e275995388-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgzc\" (UniqueName: \"kubernetes.io/projected/bdbf5b12-68d6-4da4-90a8-48e275995388-kube-api-access-ddgzc\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.505514 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a5628405-485f-42a4-ba10-db97a6df14b5-serving-cert\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.502659 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c951e8a1-6a1b-44d9-9d36-0516636d679c-audit-dir\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.509383 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d28c9743-ac3d-478a-8b4d-92510027278f-config\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.511245 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-75dtr"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.511854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d28c9743-ac3d-478a-8b4d-92510027278f-serving-cert\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.511937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2f2cafa-6fac-4139-b57d-94fb44307bb1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.512365 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.512455 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hpn6h"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.512602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.512803 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4ffe336e-9a69-4b3e-81c7-34bf5333858f-node-pullsecrets\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.512744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2f2cafa-6fac-4139-b57d-94fb44307bb1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.514112 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-client-ca\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 
08:26:25.514436 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-console-config\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.515265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.515267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-config\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.515521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-encryption-config\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.515578 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.515904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.501887 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.516149 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-config\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.517082 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-oauth-serving-cert\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.517096 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62d33db5-212f-4884-b78b-159f06592142-serving-cert\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.517437 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e371a7d-d8ef-4440-a940-af49a6a2d364-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc 
kubenswrapper[4795]: I1205 08:26:25.518525 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.518562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d28c9743-ac3d-478a-8b4d-92510027278f-trusted-ca\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.519367 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-config-volume\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.520744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-image-import-ca\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.520799 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4ffe336e-9a69-4b3e-81c7-34bf5333858f-audit-dir\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 
08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.522050 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62d33db5-212f-4884-b78b-159f06592142-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.522728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-config\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.523292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.523447 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536723 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vksgm"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cwk8c"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536767 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536779 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536789 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536801 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bfv9k"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.527811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.528019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.528259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-serving-cert\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 
08:26:25.528377 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e371a7d-d8ef-4440-a940-af49a6a2d364-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.529408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.530056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.531906 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-client-ca\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.532194 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-service-ca\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " 
pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.532499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.532877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-trusted-ca-bundle\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.533002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c19621b-c574-4047-8586-75272bf2fbcc-config\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.533472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4ffe336e-9a69-4b3e-81c7-34bf5333858f-audit\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.534136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-audit-policies\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.535681 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-config\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.535830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c951e8a1-6a1b-44d9-9d36-0516636d679c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c19621b-c574-4047-8586-75272bf2fbcc-auth-proxy-config\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41bb386f-8261-4203-a385-f2918e5f9718-audit-dir\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.537112 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-serving-cert\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: 
\"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.536523 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c19621b-c574-4047-8586-75272bf2fbcc-machine-approver-tls\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.525379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.526050 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.526375 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-serving-cert\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.526601 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.537356 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-encryption-config\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.527599 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5628405-485f-42a4-ba10-db97a6df14b5-serving-cert\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.527464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-secret-volume\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.529754 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.537740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.539648 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.539793 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.540303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-serving-cert\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.540354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4ffe336e-9a69-4b3e-81c7-34bf5333858f-etcd-client\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.540579 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-serving-cert\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.540679 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.541109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c951e8a1-6a1b-44d9-9d36-0516636d679c-etcd-client\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.541221 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.542245 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.543253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.543875 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j54hb"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.544448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-oauth-config\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.545484 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tgggp"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.546834 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.547183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.548157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47r47"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.548245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.551268 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tn798"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.551480 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.553196 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dknzh"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.554973 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.555094 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.564667 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.570830 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.570850 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.573333 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.574257 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4bg7c"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.576026 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.577779 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dknzh"] Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.592053 4795 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.609419 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.619355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ab895138-2fff-4449-b071-d4ad7b35ff07-metrics-tls\") pod \"dns-operator-744455d44c-tgggp\" (UID: \"ab895138-2fff-4449-b071-d4ad7b35ff07\") " pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.631788 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.653492 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.670777 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.698256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.711525 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.725178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-default-certificate\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.732536 
4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.749210 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.759873 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-stats-auth\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.771079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.777770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44511bda-0717-4c08-adf2-7dd984e85120-metrics-certs\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.790384 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.795044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdbf5b12-68d6-4da4-90a8-48e275995388-metrics-tls\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.809817 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"service-ca-bundle" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.812006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44511bda-0717-4c08-adf2-7dd984e85120-service-ca-bundle\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.830389 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.850769 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.876795 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.879851 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bdbf5b12-68d6-4da4-90a8-48e275995388-trusted-ca\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.892980 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.905384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4079335a-cdd7-48c7-8c64-7493bda89ed9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-ckn4t\" (UID: \"4079335a-cdd7-48c7-8c64-7493bda89ed9\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.910630 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.930837 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.950771 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.958974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-config\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.970939 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 08:26:25 crc kubenswrapper[4795]: I1205 08:26:25.991055 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.001307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b02abb39-40e1-4e8b-9d51-7c775f083f92-serving-cert\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.010961 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 
08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.020880 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-ca\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.031889 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.044147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-client\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.051696 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.070839 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.076429 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b02abb39-40e1-4e8b-9d51-7c775f083f92-etcd-service-ca\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.091329 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.110534 4795 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.131436 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.150563 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.170000 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.184573 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e829d4-5649-4dcf-a646-1f7873175d2e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.190258 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.194806 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e829d4-5649-4dcf-a646-1f7873175d2e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.210163 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.231690 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.250397 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.270329 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.278397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/daf348e0-0463-4007-8696-5c1b1483348b-signing-key\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.289981 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.294257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/daf348e0-0463-4007-8696-5c1b1483348b-signing-cabundle\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.311216 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.350247 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 08:26:26 crc 
kubenswrapper[4795]: I1205 08:26:26.368970 4795 request.go:700] Waited for 1.017261394s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-operator-dockercfg-2bh8d&limit=500&resourceVersion=0 Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.370520 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.391493 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.411438 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.417780 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: \"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.433831 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.435526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: 
\"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.471599 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.491138 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.510667 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.529852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.550573 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.571134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.590739 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.609918 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.629836 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.649664 4795 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.669513 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.690243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.710916 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.731065 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.750667 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.770938 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.790478 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.810059 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.830154 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.850170 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.871183 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.891728 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.922181 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.930798 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.950884 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.969529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 08:26:26 crc kubenswrapper[4795]: I1205 08:26:26.991520 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.012525 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.031392 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.051242 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.100903 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwwzb\" (UniqueName: \"kubernetes.io/projected/4079335a-cdd7-48c7-8c64-7493bda89ed9-kube-api-access-wwwzb\") pod \"package-server-manager-789f6589d5-ckn4t\" (UID: \"4079335a-cdd7-48c7-8c64-7493bda89ed9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.129786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66pmj\" (UniqueName: \"kubernetes.io/projected/b02abb39-40e1-4e8b-9d51-7c775f083f92-kube-api-access-66pmj\") pod \"etcd-operator-b45778765-zpcq7\" (UID: \"b02abb39-40e1-4e8b-9d51-7c775f083f92\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.152264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlzg\" (UniqueName: \"kubernetes.io/projected/9c19621b-c574-4047-8586-75272bf2fbcc-kube-api-access-vjlzg\") pod \"machine-approver-56656f9798-mm8ll\" (UID: \"9c19621b-c574-4047-8586-75272bf2fbcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.157368 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.167874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v78bs\" (UniqueName: \"kubernetes.io/projected/d28c9743-ac3d-478a-8b4d-92510027278f-kube-api-access-v78bs\") pod \"console-operator-58897d9998-d7l5q\" (UID: \"d28c9743-ac3d-478a-8b4d-92510027278f\") " pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.184473 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.187237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r77d\" (UniqueName: \"kubernetes.io/projected/ea39bb65-aae0-48fe-ae6a-4736ab5cf336-kube-api-access-5r77d\") pod \"openshift-config-operator-7777fb866f-6fkrg\" (UID: \"ea39bb65-aae0-48fe-ae6a-4736ab5cf336\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.207200 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8b25\" (UniqueName: \"kubernetes.io/projected/bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70-kube-api-access-n8b25\") pod \"kube-storage-version-migrator-operator-b67b599dd-mbgnr\" (UID: \"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.208741 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.217390 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.231448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dz9x\" (UniqueName: \"kubernetes.io/projected/59acd2a1-e0cc-439c-9e9e-a2ca39e05e52-kube-api-access-7dz9x\") pod \"machine-api-operator-5694c8668f-x6pjq\" (UID: \"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.249100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmfd\" (UniqueName: \"kubernetes.io/projected/44511bda-0717-4c08-adf2-7dd984e85120-kube-api-access-6wmfd\") pod \"router-default-5444994796-xcg4r\" (UID: \"44511bda-0717-4c08-adf2-7dd984e85120\") " pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.266191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxdsw\" (UniqueName: \"kubernetes.io/projected/67c6f735-c0f7-4539-a2d4-0785b4238435-kube-api-access-rxdsw\") pod \"console-f9d7485db-r8zdl\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.286263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29k7k\" (UniqueName: \"kubernetes.io/projected/62d33db5-212f-4884-b78b-159f06592142-kube-api-access-29k7k\") pod \"authentication-operator-69f744f599-v8qz8\" (UID: \"62d33db5-212f-4884-b78b-159f06592142\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.290923 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" 
Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.301986 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.310250 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.331144 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.346908 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.371868 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.385244 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.385435 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.387373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7l56\" (UniqueName: \"kubernetes.io/projected/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-kube-api-access-p7l56\") pod \"route-controller-manager-6576b87f9c-hkgz2\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.388873 4795 request.go:700] Waited for 1.872799264s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/serviceaccounts/kube-storage-version-migrator-sa/token Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.404909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v2rx\" (UniqueName: \"kubernetes.io/projected/19c362f4-26e6-4cd3-84dc-648d240524d3-kube-api-access-6v2rx\") pod \"migrator-59844c95c7-ps5dv\" (UID: \"19c362f4-26e6-4cd3-84dc-648d240524d3\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.425474 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29rg5\" (UniqueName: \"kubernetes.io/projected/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-kube-api-access-29rg5\") pod \"collect-profiles-29415375-fcl8t\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.431085 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.435350 4795 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.458380 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.470688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdclf\" (UniqueName: \"kubernetes.io/projected/c951e8a1-6a1b-44d9-9d36-0516636d679c-kube-api-access-wdclf\") pod \"apiserver-7bbb656c7d-6s54g\" (UID: \"c951e8a1-6a1b-44d9-9d36-0516636d679c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.473268 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.483753 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccdhp\" (UniqueName: \"kubernetes.io/projected/ab895138-2fff-4449-b071-d4ad7b35ff07-kube-api-access-ccdhp\") pod \"dns-operator-744455d44c-tgggp\" (UID: \"ab895138-2fff-4449-b071-d4ad7b35ff07\") " pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.504569 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfrfw\" (UniqueName: \"kubernetes.io/projected/f2f2cafa-6fac-4139-b57d-94fb44307bb1-kube-api-access-rfrfw\") pod \"openshift-controller-manager-operator-756b6f6bc6-mkj9d\" (UID: \"f2f2cafa-6fac-4139-b57d-94fb44307bb1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.523387 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptzlf\" (UniqueName: 
\"kubernetes.io/projected/daf348e0-0463-4007-8696-5c1b1483348b-kube-api-access-ptzlf\") pod \"service-ca-9c57cc56f-cwk8c\" (UID: \"daf348e0-0463-4007-8696-5c1b1483348b\") " pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.528296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frdzs\" (UniqueName: \"kubernetes.io/projected/41bb386f-8261-4203-a385-f2918e5f9718-kube-api-access-frdzs\") pod \"oauth-openshift-558db77b4-tn798\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.530603 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.546474 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.550879 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.567840 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.576068 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.585916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdbf5b12-68d6-4da4-90a8-48e275995388-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.593364 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.605463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsdk9\" (UniqueName: \"kubernetes.io/projected/245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d-kube-api-access-jsdk9\") pod \"downloads-7954f5f757-qsf8w\" (UID: \"245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d\") " pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.630982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvzrg\" (UniqueName: \"kubernetes.io/projected/4ffe336e-9a69-4b3e-81c7-34bf5333858f-kube-api-access-zvzrg\") pod \"apiserver-76f77b778f-ng26g\" (UID: \"4ffe336e-9a69-4b3e-81c7-34bf5333858f\") " pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.648427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgzc\" (UniqueName: \"kubernetes.io/projected/bdbf5b12-68d6-4da4-90a8-48e275995388-kube-api-access-ddgzc\") pod \"ingress-operator-5b745b69d9-4gsqk\" (UID: \"bdbf5b12-68d6-4da4-90a8-48e275995388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.662373 4795 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:27 crc kubenswrapper[4795]: W1205 08:26:27.664790 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c19621b_c574_4047_8586_75272bf2fbcc.slice/crio-292175c8f8433648bf5a734ad6e992d90bfab0741ebf053c6034f961178821b1 WatchSource:0}: Error finding container 292175c8f8433648bf5a734ad6e992d90bfab0741ebf053c6034f961178821b1: Status 404 returned error can't find the container with id 292175c8f8433648bf5a734ad6e992d90bfab0741ebf053c6034f961178821b1 Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.672736 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdpsv\" (UniqueName: \"kubernetes.io/projected/2e371a7d-d8ef-4440-a940-af49a6a2d364-kube-api-access-jdpsv\") pod \"openshift-apiserver-operator-796bbdcf4f-b5r5w\" (UID: \"2e371a7d-d8ef-4440-a940-af49a6a2d364\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.685262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3e829d4-5649-4dcf-a646-1f7873175d2e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5frx8\" (UID: \"e3e829d4-5649-4dcf-a646-1f7873175d2e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:27 crc kubenswrapper[4795]: W1205 08:26:27.687161 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44511bda_0717_4c08_adf2_7dd984e85120.slice/crio-f28525cbcec22b030430150d36207aee123e62907317bd03abde8b79976506ce WatchSource:0}: Error finding container f28525cbcec22b030430150d36207aee123e62907317bd03abde8b79976506ce: Status 
404 returned error can't find the container with id f28525cbcec22b030430150d36207aee123e62907317bd03abde8b79976506ce Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.708896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rcxj\" (UniqueName: \"kubernetes.io/projected/15cfaa37-1f25-42e6-8723-4d1e043ad9a2-kube-api-access-7rcxj\") pod \"cluster-image-registry-operator-dc59b4c8b-h58zn\" (UID: \"15cfaa37-1f25-42e6-8723-4d1e043ad9a2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.715817 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.728123 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrv2k\" (UniqueName: \"kubernetes.io/projected/a5628405-485f-42a4-ba10-db97a6df14b5-kube-api-access-wrv2k\") pod \"controller-manager-879f6c89f-vksgm\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.730249 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.745208 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.755541 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.764929 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.784733 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.794180 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.799241 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.848789 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-bound-sa-token\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.848836 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-tls\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.848863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 
08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.848896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f49bbd69-4725-4cdc-9904-33d0755bde86-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pswnm\" (UID: \"f49bbd69-4725-4cdc-9904-33d0755bde86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.848913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqrsb\" (UniqueName: \"kubernetes.io/projected/f49bbd69-4725-4cdc-9904-33d0755bde86-kube-api-access-cqrsb\") pod \"cluster-samples-operator-665b6dd947-pswnm\" (UID: \"f49bbd69-4725-4cdc-9904-33d0755bde86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.848930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.848956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.848985 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzxqk\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-kube-api-access-jzxqk\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.849007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-certificates\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.849025 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-trusted-ca\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.849046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.849072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.849090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-config\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:27 crc kubenswrapper[4795]: E1205 08:26:27.849516 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:28.349501148 +0000 UTC m=+139.922104887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.884508 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.908085 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.970047 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971365 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971652 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-trusted-ca\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971681 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971711 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad75302f-73c5-4927-acb2-b1b748d41a24-srv-cert\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971806 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/51007333-db4c-4be3-961b-c6b114574db6-srv-cert\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971831 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/34897f01-f688-4dc5-8fdc-4468365baa92-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2ffkd\" (UID: \"34897f01-f688-4dc5-8fdc-4468365baa92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0d87aa9-b51d-46a0-b060-37fa44a12238-proxy-tls\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00e1441e-0e57-4d91-8799-643363d5297f-cert\") pod \"ingress-canary-4bg7c\" (UID: \"00e1441e-0e57-4d91-8799-643363d5297f\") " pod="openshift-ingress-canary/ingress-canary-4bg7c" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" 
Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.971971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f220ce80-999b-41dd-a674-1fb462d12667-config\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:27 crc kubenswrapper[4795]: E1205 08:26:27.972014 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:28.47198348 +0000 UTC m=+140.044587219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972061 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-webhook-cert\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83923f36-bf49-4a3d-a398-bbee1e13dfeb-proxy-tls\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972169 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j7xp\" (UniqueName: \"kubernetes.io/projected/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-kube-api-access-4j7xp\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972195 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-config\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0d87aa9-b51d-46a0-b060-37fa44a12238-images\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:27 crc 
kubenswrapper[4795]: I1205 08:26:27.972230 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4bg\" (UniqueName: \"kubernetes.io/projected/ad75302f-73c5-4927-acb2-b1b748d41a24-kube-api-access-hn4bg\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-tmpfs\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b934a26f-b01a-44b2-9b01-068de3c5c9b6-certs\") pod \"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972392 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b934a26f-b01a-44b2-9b01-068de3c5c9b6-node-bootstrap-token\") pod 
\"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-apiservice-cert\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.972447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad75302f-73c5-4927-acb2-b1b748d41a24-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.974562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-trusted-ca\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.976066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.979343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-registration-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.980373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-config\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.981132 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6v7h\" (UniqueName: \"kubernetes.io/projected/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-kube-api-access-m6v7h\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.981295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-bound-sa-token\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.981374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4njqq\" (UniqueName: \"kubernetes.io/projected/6717d13e-f472-49df-9a8c-0519ed3c556e-kube-api-access-4njqq\") pod \"multus-admission-controller-857f4d67dd-j54hb\" (UID: \"6717d13e-f472-49df-9a8c-0519ed3c556e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" Dec 05 
08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.981497 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-tls\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.981572 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0d87aa9-b51d-46a0-b060-37fa44a12238-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.981664 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrhj8\" (UniqueName: \"kubernetes.io/projected/34897f01-f688-4dc5-8fdc-4468365baa92-kube-api-access-qrhj8\") pod \"control-plane-machine-set-operator-78cbb6b69f-2ffkd\" (UID: \"34897f01-f688-4dc5-8fdc-4468365baa92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.981764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4697a500-88a3-4061-9e72-c60cce09d33b-config\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.981851 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/51007333-db4c-4be3-961b-c6b114574db6-profile-collector-cert\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985519 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83923f36-bf49-4a3d-a398-bbee1e13dfeb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpxfc\" (UniqueName: \"kubernetes.io/projected/c0d87aa9-b51d-46a0-b060-37fa44a12238-kube-api-access-dpxfc\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985814 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-socket-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75m9g\" (UniqueName: \"kubernetes.io/projected/b934a26f-b01a-44b2-9b01-068de3c5c9b6-kube-api-access-75m9g\") pod \"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985855 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-plugins-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985900 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjck2\" (UniqueName: \"kubernetes.io/projected/f220ce80-999b-41dd-a674-1fb462d12667-kube-api-access-qjck2\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985921 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4kw\" (UniqueName: \"kubernetes.io/projected/0849cac0-adb5-41b4-a67a-3f7dc195e78a-kube-api-access-9j4kw\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.985977 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w85g8\" 
(UniqueName: \"kubernetes.io/projected/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-kube-api-access-w85g8\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdwf9\" (UniqueName: \"kubernetes.io/projected/83923f36-bf49-4a3d-a398-bbee1e13dfeb-kube-api-access-gdwf9\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986039 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqrsb\" (UniqueName: \"kubernetes.io/projected/f49bbd69-4725-4cdc-9904-33d0755bde86-kube-api-access-cqrsb\") pod \"cluster-samples-operator-665b6dd947-pswnm\" (UID: \"f49bbd69-4725-4cdc-9904-33d0755bde86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f49bbd69-4725-4cdc-9904-33d0755bde86-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pswnm\" (UID: \"f49bbd69-4725-4cdc-9904-33d0755bde86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-mountpoint-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " 
pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986170 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f220ce80-999b-41dd-a674-1fb462d12667-serving-cert\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4697a500-88a3-4061-9e72-c60cce09d33b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986285 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-config-volume\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-metrics-tls\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986320 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzxqk\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-kube-api-access-jzxqk\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvhx7\" (UniqueName: \"kubernetes.io/projected/51007333-db4c-4be3-961b-c6b114574db6-kube-api-access-pvhx7\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6717d13e-f472-49df-9a8c-0519ed3c556e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j54hb\" (UID: \"6717d13e-f472-49df-9a8c-0519ed3c556e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 
08:26:27.986417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw75j\" (UniqueName: \"kubernetes.io/projected/00e1441e-0e57-4d91-8799-643363d5297f-kube-api-access-gw75j\") pod \"ingress-canary-4bg7c\" (UID: \"00e1441e-0e57-4d91-8799-643363d5297f\") " pod="openshift-ingress-canary/ingress-canary-4bg7c" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4697a500-88a3-4061-9e72-c60cce09d33b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-certificates\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.986494 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-csi-data-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:27 crc kubenswrapper[4795]: E1205 08:26:27.994285 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:28.494262879 +0000 UTC m=+140.066866608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:27 crc kubenswrapper[4795]: I1205 08:26:27.995499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-certificates\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.013354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.016003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f49bbd69-4725-4cdc-9904-33d0755bde86-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pswnm\" (UID: \"f49bbd69-4725-4cdc-9904-33d0755bde86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.025147 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.047913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca452ca6-76ab-4ce8-898f-5a7b35a7137b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vp2pv\" (UID: \"ca452ca6-76ab-4ce8-898f-5a7b35a7137b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.065677 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqrsb\" (UniqueName: \"kubernetes.io/projected/f49bbd69-4725-4cdc-9904-33d0755bde86-kube-api-access-cqrsb\") pod \"cluster-samples-operator-665b6dd947-pswnm\" (UID: \"f49bbd69-4725-4cdc-9904-33d0755bde86\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.068938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-tls\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.069858 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.070020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-bound-sa-token\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.073298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzxqk\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-kube-api-access-jzxqk\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087134 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0d87aa9-b51d-46a0-b060-37fa44a12238-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrhj8\" (UniqueName: \"kubernetes.io/projected/34897f01-f688-4dc5-8fdc-4468365baa92-kube-api-access-qrhj8\") pod \"control-plane-machine-set-operator-78cbb6b69f-2ffkd\" (UID: \"34897f01-f688-4dc5-8fdc-4468365baa92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 
08:26:28.087450 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51007333-db4c-4be3-961b-c6b114574db6-profile-collector-cert\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4697a500-88a3-4061-9e72-c60cce09d33b-config\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83923f36-bf49-4a3d-a398-bbee1e13dfeb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpxfc\" (UniqueName: \"kubernetes.io/projected/c0d87aa9-b51d-46a0-b060-37fa44a12238-kube-api-access-dpxfc\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-socket-dir\") pod 
\"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75m9g\" (UniqueName: \"kubernetes.io/projected/b934a26f-b01a-44b2-9b01-068de3c5c9b6-kube-api-access-75m9g\") pod \"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjck2\" (UniqueName: \"kubernetes.io/projected/f220ce80-999b-41dd-a674-1fb462d12667-kube-api-access-qjck2\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-plugins-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4kw\" (UniqueName: \"kubernetes.io/projected/0849cac0-adb5-41b4-a67a-3f7dc195e78a-kube-api-access-9j4kw\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087658 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w85g8\" (UniqueName: 
\"kubernetes.io/projected/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-kube-api-access-w85g8\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087673 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdwf9\" (UniqueName: \"kubernetes.io/projected/83923f36-bf49-4a3d-a398-bbee1e13dfeb-kube-api-access-gdwf9\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-mountpoint-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087715 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4697a500-88a3-4061-9e72-c60cce09d33b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087729 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f220ce80-999b-41dd-a674-1fb462d12667-serving-cert\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 
08:26:28.087779 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-config-volume\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-metrics-tls\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087813 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvhx7\" (UniqueName: \"kubernetes.io/projected/51007333-db4c-4be3-961b-c6b114574db6-kube-api-access-pvhx7\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6717d13e-f472-49df-9a8c-0519ed3c556e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j54hb\" (UID: \"6717d13e-f472-49df-9a8c-0519ed3c556e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087847 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw75j\" (UniqueName: \"kubernetes.io/projected/00e1441e-0e57-4d91-8799-643363d5297f-kube-api-access-gw75j\") pod \"ingress-canary-4bg7c\" (UID: \"00e1441e-0e57-4d91-8799-643363d5297f\") " pod="openshift-ingress-canary/ingress-canary-4bg7c" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 
08:26:28.087865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4697a500-88a3-4061-9e72-c60cce09d33b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-csi-data-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad75302f-73c5-4927-acb2-b1b748d41a24-srv-cert\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51007333-db4c-4be3-961b-c6b114574db6-srv-cert\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/34897f01-f688-4dc5-8fdc-4468365baa92-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2ffkd\" (UID: 
\"34897f01-f688-4dc5-8fdc-4468365baa92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0d87aa9-b51d-46a0-b060-37fa44a12238-proxy-tls\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.087986 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f220ce80-999b-41dd-a674-1fb462d12667-config\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088018 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00e1441e-0e57-4d91-8799-643363d5297f-cert\") pod \"ingress-canary-4bg7c\" (UID: \"00e1441e-0e57-4d91-8799-643363d5297f\") " pod="openshift-ingress-canary/ingress-canary-4bg7c" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088035 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-webhook-cert\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83923f36-bf49-4a3d-a398-bbee1e13dfeb-proxy-tls\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j7xp\" (UniqueName: \"kubernetes.io/projected/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-kube-api-access-4j7xp\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088093 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088111 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0d87aa9-b51d-46a0-b060-37fa44a12238-images\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088128 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hn4bg\" (UniqueName: \"kubernetes.io/projected/ad75302f-73c5-4927-acb2-b1b748d41a24-kube-api-access-hn4bg\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-tmpfs\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088168 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b934a26f-b01a-44b2-9b01-068de3c5c9b6-certs\") pod \"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b934a26f-b01a-44b2-9b01-068de3c5c9b6-node-bootstrap-token\") pod \"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-apiservice-cert\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc 
kubenswrapper[4795]: I1205 08:26:28.088261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad75302f-73c5-4927-acb2-b1b748d41a24-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-registration-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6v7h\" (UniqueName: \"kubernetes.io/projected/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-kube-api-access-m6v7h\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.088349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4njqq\" (UniqueName: \"kubernetes.io/projected/6717d13e-f472-49df-9a8c-0519ed3c556e-kube-api-access-4njqq\") pod \"multus-admission-controller-857f4d67dd-j54hb\" (UID: \"6717d13e-f472-49df-9a8c-0519ed3c556e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.088706 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:28.588686863 +0000 UTC m=+140.161290602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.089252 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c0d87aa9-b51d-46a0-b060-37fa44a12238-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.091730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c0d87aa9-b51d-46a0-b060-37fa44a12238-images\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.091860 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-csi-data-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.092714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4697a500-88a3-4061-9e72-c60cce09d33b-config\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.093052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.094144 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/83923f36-bf49-4a3d-a398-bbee1e13dfeb-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.094640 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-socket-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.094957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-plugins-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.095225 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-mountpoint-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.099459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f220ce80-999b-41dd-a674-1fb462d12667-config\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.102669 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-registration-dir\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.103819 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-tmpfs\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.105596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51007333-db4c-4be3-961b-c6b114574db6-profile-collector-cert\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.107091 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-apiservice-cert\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.108438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b934a26f-b01a-44b2-9b01-068de3c5c9b6-node-bootstrap-token\") pod \"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.108439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-webhook-cert\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.114371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.114934 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/00e1441e-0e57-4d91-8799-643363d5297f-cert\") pod \"ingress-canary-4bg7c\" (UID: \"00e1441e-0e57-4d91-8799-643363d5297f\") " pod="openshift-ingress-canary/ingress-canary-4bg7c" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.115174 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-config-volume\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.115565 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad75302f-73c5-4927-acb2-b1b748d41a24-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.118095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad75302f-73c5-4927-acb2-b1b748d41a24-srv-cert\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.122691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83923f36-bf49-4a3d-a398-bbee1e13dfeb-proxy-tls\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.123231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0d87aa9-b51d-46a0-b060-37fa44a12238-proxy-tls\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.123319 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/34897f01-f688-4dc5-8fdc-4468365baa92-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2ffkd\" (UID: \"34897f01-f688-4dc5-8fdc-4468365baa92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.123663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6717d13e-f472-49df-9a8c-0519ed3c556e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-j54hb\" (UID: \"6717d13e-f472-49df-9a8c-0519ed3c556e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.123817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f220ce80-999b-41dd-a674-1fb462d12667-serving-cert\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.124279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4697a500-88a3-4061-9e72-c60cce09d33b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.124788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-metrics-tls\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " 
pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.128092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4njqq\" (UniqueName: \"kubernetes.io/projected/6717d13e-f472-49df-9a8c-0519ed3c556e-kube-api-access-4njqq\") pod \"multus-admission-controller-857f4d67dd-j54hb\" (UID: \"6717d13e-f472-49df-9a8c-0519ed3c556e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.130840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b934a26f-b01a-44b2-9b01-068de3c5c9b6-certs\") pod \"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.136699 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.141967 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51007333-db4c-4be3-961b-c6b114574db6-srv-cert\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.164476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrhj8\" (UniqueName: \"kubernetes.io/projected/34897f01-f688-4dc5-8fdc-4468365baa92-kube-api-access-qrhj8\") pod \"control-plane-machine-set-operator-78cbb6b69f-2ffkd\" (UID: \"34897f01-f688-4dc5-8fdc-4468365baa92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.172843 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j7xp\" (UniqueName: \"kubernetes.io/projected/371f5b94-9d02-45c4-8f06-2aeb582f1bbc-kube-api-access-4j7xp\") pod \"dns-default-dknzh\" (UID: \"371f5b94-9d02-45c4-8f06-2aeb582f1bbc\") " pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.189924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.190386 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:28.690371331 +0000 UTC m=+140.262975070 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.266939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75m9g\" (UniqueName: \"kubernetes.io/projected/b934a26f-b01a-44b2-9b01-068de3c5c9b6-kube-api-access-75m9g\") pod \"machine-config-server-75dtr\" (UID: \"b934a26f-b01a-44b2-9b01-068de3c5c9b6\") " pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.269152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpxfc\" (UniqueName: \"kubernetes.io/projected/c0d87aa9-b51d-46a0-b060-37fa44a12238-kube-api-access-dpxfc\") pod \"machine-config-operator-74547568cd-jdmf7\" (UID: \"c0d87aa9-b51d-46a0-b060-37fa44a12238\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.270427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.270894 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-75dtr" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.271171 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.297210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w85g8\" (UniqueName: \"kubernetes.io/projected/c31b365f-c7a0-48ca-9118-141e6ac9b8fb-kube-api-access-w85g8\") pod \"packageserver-d55dfcdfc-2v7bs\" (UID: \"c31b365f-c7a0-48ca-9118-141e6ac9b8fb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.299997 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.301827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.302724 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:28.802705454 +0000 UTC m=+140.375309193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.303116 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.303674 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdwf9\" (UniqueName: \"kubernetes.io/projected/83923f36-bf49-4a3d-a398-bbee1e13dfeb-kube-api-access-gdwf9\") pod \"machine-config-controller-84d6567774-gbmkb\" (UID: \"83923f36-bf49-4a3d-a398-bbee1e13dfeb\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.304517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4kw\" (UniqueName: \"kubernetes.io/projected/0849cac0-adb5-41b4-a67a-3f7dc195e78a-kube-api-access-9j4kw\") pod \"marketplace-operator-79b997595-47r47\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.325135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjck2\" (UniqueName: \"kubernetes.io/projected/f220ce80-999b-41dd-a674-1fb462d12667-kube-api-access-qjck2\") pod \"service-ca-operator-777779d784-ql7nq\" (UID: \"f220ce80-999b-41dd-a674-1fb462d12667\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:28 crc 
kubenswrapper[4795]: I1205 08:26:28.327790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4bg\" (UniqueName: \"kubernetes.io/projected/ad75302f-73c5-4927-acb2-b1b748d41a24-kube-api-access-hn4bg\") pod \"olm-operator-6b444d44fb-qcmrn\" (UID: \"ad75302f-73c5-4927-acb2-b1b748d41a24\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.356584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6v7h\" (UniqueName: \"kubernetes.io/projected/a0ad2af1-a387-49d4-9e6c-3dadfe6800d7-kube-api-access-m6v7h\") pod \"csi-hostpathplugin-bfv9k\" (UID: \"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7\") " pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.379509 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvhx7\" (UniqueName: \"kubernetes.io/projected/51007333-db4c-4be3-961b-c6b114574db6-kube-api-access-pvhx7\") pod \"catalog-operator-68c6474976-vrdc4\" (UID: \"51007333-db4c-4be3-961b-c6b114574db6\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.397442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw75j\" (UniqueName: \"kubernetes.io/projected/00e1441e-0e57-4d91-8799-643363d5297f-kube-api-access-gw75j\") pod \"ingress-canary-4bg7c\" (UID: \"00e1441e-0e57-4d91-8799-643363d5297f\") " pod="openshift-ingress-canary/ingress-canary-4bg7c" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.404788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: 
\"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.405545 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:28.905528616 +0000 UTC m=+140.478132355 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.414891 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4697a500-88a3-4061-9e72-c60cce09d33b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v7z2z\" (UID: \"4697a500-88a3-4061-9e72-c60cce09d33b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.438975 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.444934 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.455530 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.464690 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.473422 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.484303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.492886 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.501697 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-x6pjq"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.502123 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.504664 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-d7l5q"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.506255 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.506683 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.006659988 +0000 UTC m=+140.579263727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.509925 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.511739 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4bg7c" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.561881 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.607917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.608306 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.108290176 +0000 UTC m=+140.680893915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.676551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xcg4r" event={"ID":"44511bda-0717-4c08-adf2-7dd984e85120","Type":"ContainerStarted","Data":"f28525cbcec22b030430150d36207aee123e62907317bd03abde8b79976506ce"} Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.689077 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" event={"ID":"9c19621b-c574-4047-8586-75272bf2fbcc","Type":"ContainerStarted","Data":"292175c8f8433648bf5a734ad6e992d90bfab0741ebf053c6034f961178821b1"} Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.705051 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-75dtr" event={"ID":"b934a26f-b01a-44b2-9b01-068de3c5c9b6","Type":"ContainerStarted","Data":"ff0b5404b3cc68c223b45f3bde18fb566c99c1fe1b5ca559ad3f68a4b31cd903"} Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.705382 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.710491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.710977 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.210941182 +0000 UTC m=+140.783544931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.737004 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.812141 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.813272 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.313259379 +0000 UTC m=+140.885863118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.851054 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v8qz8"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.862119 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.862135 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.862151 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zpcq7"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.868582 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-r8zdl"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.868686 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tn798"] Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.913091 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:28 crc 
kubenswrapper[4795]: E1205 08:26:28.913286 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.413244956 +0000 UTC m=+140.985848695 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: I1205 08:26:28.913384 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:28 crc kubenswrapper[4795]: E1205 08:26:28.913779 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.413765823 +0000 UTC m=+140.986369562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:28 crc kubenswrapper[4795]: W1205 08:26:28.962690 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc10b5ef_30bb_4a65_a9ae_e4bed4b83b70.slice/crio-cf28a4d7620c3746222fd18a57414889b7a8997d41c4ef5f3711404c11db1308 WatchSource:0}: Error finding container cf28a4d7620c3746222fd18a57414889b7a8997d41c4ef5f3711404c11db1308: Status 404 returned error can't find the container with id cf28a4d7620c3746222fd18a57414889b7a8997d41c4ef5f3711404c11db1308 Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.018440 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.019118 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.519088428 +0000 UTC m=+141.091692167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.038367 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.106729 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tgggp"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.122894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.165368 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.180248 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.189599 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vksgm"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.195505 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.195569 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d"] Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.145602 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.645541069 +0000 UTC m=+141.218144808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.224978 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.225197 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.725174865 +0000 UTC m=+141.297778604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.225446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.226243 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.726234586 +0000 UTC m=+141.298838325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.326998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.327436 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.827408489 +0000 UTC m=+141.400012238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.343931 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.354238 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.437015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.437441 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:29.937423074 +0000 UTC m=+141.510026813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: W1205 08:26:29.471200 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbf5b12_68d6_4da4_90a8_48e275995388.slice/crio-77c948e9ae4f9026ba4833eca325637ce7e4c5e401c23064ee7b3bcfeec9f140 WatchSource:0}: Error finding container 77c948e9ae4f9026ba4833eca325637ce7e4c5e401c23064ee7b3bcfeec9f140: Status 404 returned error can't find the container with id 77c948e9ae4f9026ba4833eca325637ce7e4c5e401c23064ee7b3bcfeec9f140 Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.538182 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.539094 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.039072831 +0000 UTC m=+141.611676570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.559328 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ng26g"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.641296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.658382 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.158344669 +0000 UTC m=+141.730948408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.749586 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.763689 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.263647015 +0000 UTC m=+141.836250754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.774183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.774729 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.274714662 +0000 UTC m=+141.847318401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.807128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" event={"ID":"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821","Type":"ContainerStarted","Data":"5543b7dccf9e7de1d3db3465bb8b3dfd3581a34f8e00faab1996c776858d7eec"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.810967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" event={"ID":"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae","Type":"ContainerStarted","Data":"82e40d64e4aa9c65da43d12a968851dce4aa407b924729b1e89dd16d50a02f95"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.816733 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" event={"ID":"ea39bb65-aae0-48fe-ae6a-4736ab5cf336","Type":"ContainerStarted","Data":"99754a4513f33d9adf451a3942ff03092d401d6e39b031e6fcb43a09db07eaa8"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.820280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" event={"ID":"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52","Type":"ContainerStarted","Data":"156ffea091abedbe2f3601608421d24bca1b9a88f86ddd790559f13f0954863e"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.832729 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-qsf8w"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.833079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" event={"ID":"bdbf5b12-68d6-4da4-90a8-48e275995388","Type":"ContainerStarted","Data":"77c948e9ae4f9026ba4833eca325637ce7e4c5e401c23064ee7b3bcfeec9f140"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.834450 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.857459 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cwk8c"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.864360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" event={"ID":"ad75302f-73c5-4927-acb2-b1b748d41a24","Type":"ContainerStarted","Data":"b10204b454991460ae4012a62778f520db02ad70aaef1bfbb8ee807b1ff47ffc"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.869377 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r8zdl" event={"ID":"67c6f735-c0f7-4539-a2d4-0785b4238435","Type":"ContainerStarted","Data":"be1324b836f833629e0679c3440e0b9a7166ce0f051ba18af249f5603076b35e"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.870213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" event={"ID":"d28c9743-ac3d-478a-8b4d-92510027278f","Type":"ContainerStarted","Data":"ad904ba87b94e67114c5ca5dcb74defbd84414967be26cd754185a66740188e4"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.871101 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" 
event={"ID":"e3e829d4-5649-4dcf-a646-1f7873175d2e","Type":"ContainerStarted","Data":"634ca03878382efed4812d16852fddb779a8fecc485e3ab9c055f4a14d817953"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.881028 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-j54hb"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.881890 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.882084 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.382041547 +0000 UTC m=+141.954645286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.882225 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.882652 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.382644855 +0000 UTC m=+141.955248584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.883280 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w"] Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.893363 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" event={"ID":"c951e8a1-6a1b-44d9-9d36-0516636d679c","Type":"ContainerStarted","Data":"19e3f820e02642e704a02a608b44152f281ae4980b33faaade2a95e89587e47f"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.903173 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" event={"ID":"15cfaa37-1f25-42e6-8723-4d1e043ad9a2","Type":"ContainerStarted","Data":"4430f2488800c078ac65820954e3d20f63cb420ea3f0bb93a6f5d85859136be8"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.917891 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" event={"ID":"9c19621b-c574-4047-8586-75272bf2fbcc","Type":"ContainerStarted","Data":"c05e875bf03672ce192a91006384f725c618596385177b722c80f55b81308d5c"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.926964 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" 
event={"ID":"b02abb39-40e1-4e8b-9d51-7c775f083f92","Type":"ContainerStarted","Data":"8ac274d7352357e023cd4140cc1d612dc766656f9e90161b5cef82ae8fbdfe7d"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.936937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" event={"ID":"ab895138-2fff-4449-b071-d4ad7b35ff07","Type":"ContainerStarted","Data":"a2968ee727580f93d8a59412b8afe8f01bee6efc35c7c26800cd45e6dbbfced4"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.945884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" event={"ID":"f2f2cafa-6fac-4139-b57d-94fb44307bb1","Type":"ContainerStarted","Data":"5ec7e18fec4c56413c3cc33ff6e5b0bc1e601f5dde84ab297f372dbfbf364abd"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.947659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" event={"ID":"41bb386f-8261-4203-a385-f2918e5f9718","Type":"ContainerStarted","Data":"a2a9e8b1be6b75caf9bdfdfd68b277b86d8aff01dbd4d665dcd183ab8bc2b596"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.948904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" event={"ID":"a5628405-485f-42a4-ba10-db97a6df14b5","Type":"ContainerStarted","Data":"459ccda15d82f50dac0c57abe2a688eedefaeedde67e373ca53a90ff21b79c6c"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.952114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" event={"ID":"19c362f4-26e6-4cd3-84dc-648d240524d3","Type":"ContainerStarted","Data":"473fce3b2165e5bc571e6256591d723cacd11ccad9983734b4fca8238e42ae58"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.960360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" event={"ID":"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70","Type":"ContainerStarted","Data":"cf28a4d7620c3746222fd18a57414889b7a8997d41c4ef5f3711404c11db1308"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.964912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" event={"ID":"4079335a-cdd7-48c7-8c64-7493bda89ed9","Type":"ContainerStarted","Data":"539ee54a84237c242c5e0813c0cc5065fc9e1063970980d56db0f7e225a39f76"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.967013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xcg4r" event={"ID":"44511bda-0717-4c08-adf2-7dd984e85120","Type":"ContainerStarted","Data":"d492554bef532ec5025077b507e8d6c2a680f9ee728eeb9a5ffab736e09f7a79"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.968262 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" event={"ID":"62d33db5-212f-4884-b78b-159f06592142","Type":"ContainerStarted","Data":"284ef5ae8712b772962ebafaf4fec6b1852c1694cd9827a377f456b629a80b28"} Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.984686 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.985472 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:30.485419996 +0000 UTC m=+142.058023735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:29 crc kubenswrapper[4795]: I1205 08:26:29.990323 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:29 crc kubenswrapper[4795]: E1205 08:26:29.991023 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.491004981 +0000 UTC m=+142.063608720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.095445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.096421 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.596381229 +0000 UTC m=+142.168985088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.106521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.201489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.202434 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.702413996 +0000 UTC m=+142.275017735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.206574 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4bg7c"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.219982 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dknzh"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.243521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.248431 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.281110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.295215 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.302663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:30 crc kubenswrapper[4795]: 
E1205 08:26:30.303145 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.803123345 +0000 UTC m=+142.375727084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.313547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.327750 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.352203 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47r47"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.394888 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xcg4r" podStartSLOduration=123.394865478 podStartE2EDuration="2m3.394865478s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:30.368120518 +0000 UTC m=+141.940724267" watchObservedRunningTime="2025-12-05 08:26:30.394865478 +0000 UTC m=+141.967469207" Dec 05 08:26:30 crc 
kubenswrapper[4795]: I1205 08:26:30.405840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.406988 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:30.906952957 +0000 UTC m=+142.479556696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.413601 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bfv9k"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.426406 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7"] Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.435848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.443218 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:30 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:30 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:30 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.443285 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:30 crc kubenswrapper[4795]: W1205 08:26:30.443381 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4697a500_88a3_4061_9e72_c60cce09d33b.slice/crio-b93424def32240461d84ab0c5d970d7cc97a44d1b5b010de963001a3d483ab5f WatchSource:0}: Error finding container b93424def32240461d84ab0c5d970d7cc97a44d1b5b010de963001a3d483ab5f: Status 404 returned error can't find the container with id b93424def32240461d84ab0c5d970d7cc97a44d1b5b010de963001a3d483ab5f Dec 05 08:26:30 crc kubenswrapper[4795]: W1205 08:26:30.487820 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0ad2af1_a387_49d4_9e6c_3dadfe6800d7.slice/crio-58800faba670f3ea1c57405ac5bed11463074848d9a95efe09799a9238afa050 WatchSource:0}: Error finding container 58800faba670f3ea1c57405ac5bed11463074848d9a95efe09799a9238afa050: Status 404 returned error can't find the container with id 58800faba670f3ea1c57405ac5bed11463074848d9a95efe09799a9238afa050 Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.507588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.508577 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.008557962 +0000 UTC m=+142.581161701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.610708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.611496 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.111482777 +0000 UTC m=+142.684086516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.719415 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.719867 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.219836952 +0000 UTC m=+142.792440691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.831468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.831991 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.331968539 +0000 UTC m=+142.904572278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.937388 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.938020 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.437988886 +0000 UTC m=+143.010592625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.938080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:30 crc kubenswrapper[4795]: E1205 08:26:30.938745 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.438706137 +0000 UTC m=+143.011309876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.991038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" event={"ID":"41bb386f-8261-4203-a385-f2918e5f9718","Type":"ContainerStarted","Data":"ccfd521165f6178751f2eab9fc0e316567ab3fa16e3dec56a96962d2d68d5a26"} Dec 05 08:26:30 crc kubenswrapper[4795]: I1205 08:26:30.992354 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.000775 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" event={"ID":"4079335a-cdd7-48c7-8c64-7493bda89ed9","Type":"ContainerStarted","Data":"7279dc9f88826a86631cb2e8fde31d79fcbb759cc816ab6821a5cf03ea5afeec"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.003268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4bg7c" event={"ID":"00e1441e-0e57-4d91-8799-643363d5297f","Type":"ContainerStarted","Data":"d53ff826df89fa9f15d71f9c97e2cdc811518fcb17b07117242ced52f3d22423"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.007764 4795 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tn798 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" 
start-of-body= Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.007999 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" podUID="41bb386f-8261-4203-a385-f2918e5f9718" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": dial tcp 10.217.0.29:6443: connect: connection refused" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.012586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" event={"ID":"b02abb39-40e1-4e8b-9d51-7c775f083f92","Type":"ContainerStarted","Data":"f35db86f0730a28b786c4d750607bed55c00b7017ede5b37a46431b0f6e9a6bd"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.030232 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" podStartSLOduration=124.030191594 podStartE2EDuration="2m4.030191594s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:31.030102471 +0000 UTC m=+142.602706210" watchObservedRunningTime="2025-12-05 08:26:31.030191594 +0000 UTC m=+142.602795333" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.036211 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" event={"ID":"6717d13e-f472-49df-9a8c-0519ed3c556e","Type":"ContainerStarted","Data":"53039527e094efaa7295dbf36a2a20cb7ec57a0382281e6afec8a60382fb3f97"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.039328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.041375 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.541346364 +0000 UTC m=+143.113950103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.044985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" event={"ID":"bc10b5ef-30bb-4a65-a9ae-e4bed4b83b70","Type":"ContainerStarted","Data":"2ffc5259f74312bc8bcdf5be0232b081ed89da75b42351f2a548f7d282de21bd"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.051769 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r8zdl" event={"ID":"67c6f735-c0f7-4539-a2d4-0785b4238435","Type":"ContainerStarted","Data":"ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.063153 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dknzh" event={"ID":"371f5b94-9d02-45c4-8f06-2aeb582f1bbc","Type":"ContainerStarted","Data":"b6a771e33d0f50e37b2118f47b0908457e4d744680ccb4017efc83229e622720"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 
08:26:31.072568 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-zpcq7" podStartSLOduration=124.072551717 podStartE2EDuration="2m4.072551717s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:31.065577391 +0000 UTC m=+142.638181130" watchObservedRunningTime="2025-12-05 08:26:31.072551717 +0000 UTC m=+142.645155456" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.079255 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" event={"ID":"bdbf5b12-68d6-4da4-90a8-48e275995388","Type":"ContainerStarted","Data":"074f903b463087ba93c2036ce682a28addfd1da0d6f32081943bff39338e3faf"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.102633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" event={"ID":"c31b365f-c7a0-48ca-9118-141e6ac9b8fb","Type":"ContainerStarted","Data":"e5ae6011fb6489f09fb8274c3771cbc0067ae3fafcb1b8bfd67ef80e26d6a3c4"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.117143 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-r8zdl" podStartSLOduration=124.117096455 podStartE2EDuration="2m4.117096455s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:31.112624563 +0000 UTC m=+142.685228302" watchObservedRunningTime="2025-12-05 08:26:31.117096455 +0000 UTC m=+142.689700194" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.125911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" 
event={"ID":"c0d87aa9-b51d-46a0-b060-37fa44a12238","Type":"ContainerStarted","Data":"36b42334a576de91483ee9f39956db1ec80366d243577320229c774a0f35e601"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.129259 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" event={"ID":"f220ce80-999b-41dd-a674-1fb462d12667","Type":"ContainerStarted","Data":"0f0ca08f1d02b7290261f690d54ee18644a806663cb74da65d877fb7fc61622a"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.143527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.148304 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.648286147 +0000 UTC m=+143.220889886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.155264 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-mbgnr" podStartSLOduration=124.155238454 podStartE2EDuration="2m4.155238454s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:31.142684912 +0000 UTC m=+142.715288661" watchObservedRunningTime="2025-12-05 08:26:31.155238454 +0000 UTC m=+142.727842193" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.166097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" event={"ID":"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7","Type":"ContainerStarted","Data":"58800faba670f3ea1c57405ac5bed11463074848d9a95efe09799a9238afa050"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.168448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" event={"ID":"2e371a7d-d8ef-4440-a940-af49a6a2d364","Type":"ContainerStarted","Data":"eb2c864e23c885703eb48662fd3078ca987cdfa7eee20580d2100b5f13303ad9"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.192504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" 
event={"ID":"34897f01-f688-4dc5-8fdc-4468365baa92","Type":"ContainerStarted","Data":"659b14a646213a54f65d2693c59c2f6b10741ccdc5ba4b2e9f2ba2a833944370"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.194962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" event={"ID":"daf348e0-0463-4007-8696-5c1b1483348b","Type":"ContainerStarted","Data":"6ec77d34cf8a278af56c1e3b461173ec8a2ac129e3e6f95216a5e0fb275d1770"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.227817 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qsf8w" event={"ID":"245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d","Type":"ContainerStarted","Data":"682ffe271781d4a25403d09b7416f522a3659c6bb6c95a36b8f902e51adc935f"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.246302 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.246785 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.746761141 +0000 UTC m=+143.319364880 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.255288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" event={"ID":"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52","Type":"ContainerStarted","Data":"2106aac406d3334d897fce3575f9c87558baf580e03bd81fef21612b4893ad67"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.269868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" event={"ID":"ca452ca6-76ab-4ce8-898f-5a7b35a7137b","Type":"ContainerStarted","Data":"8670f6597cae7ef6a9f384a893f9d248bdec48806a6b10574d4cd3c0da66eb59"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.349047 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.349873 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:31.849857021 +0000 UTC m=+143.422460760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.388982 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" event={"ID":"d28c9743-ac3d-478a-8b4d-92510027278f","Type":"ContainerStarted","Data":"011a56b0eeaf81ba30ffb96d39deaf53854d29fd9ca0e25129c46b7167519bfb"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.390466 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.400452 4795 patch_prober.go:28] interesting pod/console-operator-58897d9998-d7l5q container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.400530 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" podUID="d28c9743-ac3d-478a-8b4d-92510027278f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.409898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" 
event={"ID":"4ffe336e-9a69-4b3e-81c7-34bf5333858f","Type":"ContainerStarted","Data":"ba4e441e5bf4b081f964431fb9f5e63e1ff56021fe10d050dd5af1094b663e42"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.444885 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:31 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:31 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:31 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.444949 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.455440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" event={"ID":"c951e8a1-6a1b-44d9-9d36-0516636d679c","Type":"ContainerStarted","Data":"a4f5f54ba5f1b7c4752ddb49fed40dc56b7acc675ef43eaf741f271ebbd005ee"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.457105 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.458403 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:31.958381322 +0000 UTC m=+143.530985061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.486002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" event={"ID":"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821","Type":"ContainerStarted","Data":"ccc4fa14d91dac69258353003e0ef4116337a90fa2a6d3594c4e85764a686450"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.503761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" podStartSLOduration=124.503737903 podStartE2EDuration="2m4.503737903s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:31.431760134 +0000 UTC m=+143.004363893" watchObservedRunningTime="2025-12-05 08:26:31.503737903 +0000 UTC m=+143.076341642" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.519915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" event={"ID":"0849cac0-adb5-41b4-a67a-3f7dc195e78a","Type":"ContainerStarted","Data":"79a6032a22553b2d5653ffa59d16fb157c7a2f8b4531b006b0257224cf29dfe1"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.545013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" event={"ID":"19c362f4-26e6-4cd3-84dc-648d240524d3","Type":"ContainerStarted","Data":"404f3eaa4f77bfa3f41f0b637cdc2f7ddc3890fccbb2154baf0ce1e06396f16c"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.549266 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" podStartSLOduration=124.549236939 podStartE2EDuration="2m4.549236939s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:31.544819729 +0000 UTC m=+143.117423468" watchObservedRunningTime="2025-12-05 08:26:31.549236939 +0000 UTC m=+143.121840698" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.577311 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.582893 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.082874144 +0000 UTC m=+143.655477883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.583108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" event={"ID":"83923f36-bf49-4a3d-a398-bbee1e13dfeb","Type":"ContainerStarted","Data":"03f3ee02f5e17fb61e2e500bbff8ff67b7e164abd765b423c1936085c6065a59"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.586892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" event={"ID":"51007333-db4c-4be3-961b-c6b114574db6","Type":"ContainerStarted","Data":"30a5375a683f9e40f125bc6894dc7e17766fc19636215484e380c1a19b3e4382"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.619076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" event={"ID":"4697a500-88a3-4061-9e72-c60cce09d33b","Type":"ContainerStarted","Data":"b93424def32240461d84ab0c5d970d7cc97a44d1b5b010de963001a3d483ab5f"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.631067 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" event={"ID":"ab895138-2fff-4449-b071-d4ad7b35ff07","Type":"ContainerStarted","Data":"82482b3ba5fdfa25dc87664c0a393b9f4ae23e6542b02768a45269a9bc26249b"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.634523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" event={"ID":"62d33db5-212f-4884-b78b-159f06592142","Type":"ContainerStarted","Data":"6790db950d5ea4048915511095ed8ac5db7dd4ee079d754b00cbf5d97b3f429f"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.637554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-75dtr" event={"ID":"b934a26f-b01a-44b2-9b01-068de3c5c9b6","Type":"ContainerStarted","Data":"24fc73f1546640b6a9232df500520b6f8e1cf008592ec59c08ab130d58dd5c20"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.646980 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" podStartSLOduration=124.64695372 podStartE2EDuration="2m4.64695372s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:31.639478259 +0000 UTC m=+143.212081998" watchObservedRunningTime="2025-12-05 08:26:31.64695372 +0000 UTC m=+143.219557459" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.651898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" event={"ID":"f49bbd69-4725-4cdc-9904-33d0755bde86","Type":"ContainerStarted","Data":"ca7e664cdb41f9a3be036deb2a8c1b7977f5d61c3a43b7b9baed409977c050c1"} Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.665945 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-75dtr" podStartSLOduration=6.665924251 podStartE2EDuration="6.665924251s" podCreationTimestamp="2025-12-05 08:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 
08:26:31.664382645 +0000 UTC m=+143.236986374" watchObservedRunningTime="2025-12-05 08:26:31.665924251 +0000 UTC m=+143.238527990" Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.678509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.681196 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.181166422 +0000 UTC m=+143.753770161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.710627 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" podStartSLOduration=124.710584593 podStartE2EDuration="2m4.710584593s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:31.70945493 +0000 UTC m=+143.282058669" watchObservedRunningTime="2025-12-05 08:26:31.710584593 +0000 UTC m=+143.283188332" Dec 05 08:26:31 crc 
kubenswrapper[4795]: I1205 08:26:31.782453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.785056 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.285035365 +0000 UTC m=+143.857639104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.883929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.884203 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:32.384154007 +0000 UTC m=+143.956757746 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.884511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.884942 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.38492392 +0000 UTC m=+143.957527659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.992183 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.992272 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.492226915 +0000 UTC m=+144.064830654 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:31 crc kubenswrapper[4795]: I1205 08:26:31.993914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:31 crc kubenswrapper[4795]: E1205 08:26:31.994472 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.49444754 +0000 UTC m=+144.067051279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.095477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.096042 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.596022316 +0000 UTC m=+144.168626055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.197080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.197416 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.697404205 +0000 UTC m=+144.270007944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.298402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.299007 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.79898677 +0000 UTC m=+144.371590509 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.399951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.400522 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:32.900496713 +0000 UTC m=+144.473100452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.451954 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:32 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:32 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:32 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.452013 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.502172 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.502744 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:33.002723397 +0000 UTC m=+144.575327136 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.604092 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.605025 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.104986232 +0000 UTC m=+144.677589971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.705240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qsf8w" event={"ID":"245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d","Type":"ContainerStarted","Data":"d510b8f9c85b2fd239a150339c8515456d849b5c54ecb28507cdac1871f1ef99"} Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.707074 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.709454 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.709948 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.209910157 +0000 UTC m=+144.782513896 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.713763 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.713824 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.726852 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" event={"ID":"59acd2a1-e0cc-439c-9e9e-a2ca39e05e52","Type":"ContainerStarted","Data":"76d80e9f622d9ca3685116bd3b00454cc473c4a33aaaeb2bef4e769a25d69e4f"} Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.747265 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qsf8w" podStartSLOduration=125.747242251 podStartE2EDuration="2m5.747242251s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:32.74686547 +0000 UTC m=+144.319469209" 
watchObservedRunningTime="2025-12-05 08:26:32.747242251 +0000 UTC m=+144.319845990" Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.776446 4795 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vksgm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.776540 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" podUID="a5628405-485f-42a4-ba10-db97a6df14b5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.779148 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" event={"ID":"a5628405-485f-42a4-ba10-db97a6df14b5","Type":"ContainerStarted","Data":"a5ffdf754d494e56c98f12d75b2b69f03ff93dc68dc320d72b38bf8cd67db986"} Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.779221 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.826997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.827407 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.327392293 +0000 UTC m=+144.899996022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.839059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" event={"ID":"f2f2cafa-6fac-4139-b57d-94fb44307bb1","Type":"ContainerStarted","Data":"874415d31ac1b0f9f6b0ac7ad09b9ed9acf26bc9e5ee8bd41b6d61521ee178e3"} Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.891236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4bg7c" event={"ID":"00e1441e-0e57-4d91-8799-643363d5297f","Type":"ContainerStarted","Data":"5b38a53cba9147cca337e3691e802ee13ba8946cf50235b3d0f3543728c35607"} Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.906268 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mkj9d" podStartSLOduration=125.906237785 podStartE2EDuration="2m5.906237785s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:32.906112431 +0000 UTC m=+144.478716170" watchObservedRunningTime="2025-12-05 08:26:32.906237785 +0000 
UTC m=+144.478841524" Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.907509 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" podStartSLOduration=125.907497732 podStartE2EDuration="2m5.907497732s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:32.833121562 +0000 UTC m=+144.405725301" watchObservedRunningTime="2025-12-05 08:26:32.907497732 +0000 UTC m=+144.480101471" Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.936788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:32 crc kubenswrapper[4795]: E1205 08:26:32.938103 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.438078917 +0000 UTC m=+145.010682666 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.941324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h58zn" event={"ID":"15cfaa37-1f25-42e6-8723-4d1e043ad9a2","Type":"ContainerStarted","Data":"7d9db1359ffba24efcb14ae6fe2610dc5a1b6e273a79cf1915418332db640fdc"} Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.962586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" event={"ID":"ad75302f-73c5-4927-acb2-b1b748d41a24","Type":"ContainerStarted","Data":"e4a3e015e772928d21921793d3276cff62a77099c6590dc8b6634096e776ab84"} Dec 05 08:26:32 crc kubenswrapper[4795]: I1205 08:26:32.964310 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.019286 4795 generic.go:334] "Generic (PLEG): container finished" podID="c951e8a1-6a1b-44d9-9d36-0516636d679c" containerID="a4f5f54ba5f1b7c4752ddb49fed40dc56b7acc675ef43eaf741f271ebbd005ee" exitCode=0 Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.019381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" event={"ID":"c951e8a1-6a1b-44d9-9d36-0516636d679c","Type":"ContainerDied","Data":"a4f5f54ba5f1b7c4752ddb49fed40dc56b7acc675ef43eaf741f271ebbd005ee"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.029460 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4bg7c" podStartSLOduration=8.02944165 podStartE2EDuration="8.02944165s" podCreationTimestamp="2025-12-05 08:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:32.962324844 +0000 UTC m=+144.534928583" watchObservedRunningTime="2025-12-05 08:26:33.02944165 +0000 UTC m=+144.602045389" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.030282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" event={"ID":"6717d13e-f472-49df-9a8c-0519ed3c556e","Type":"ContainerStarted","Data":"7b88caf090cc9bcec4e20324edd92f38314c4254b028f184cd4e01cf5fd181e8"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.038525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.040306 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.540280901 +0000 UTC m=+145.112884640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.055179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" event={"ID":"c0d87aa9-b51d-46a0-b060-37fa44a12238","Type":"ContainerStarted","Data":"44367a5f50447241fafaa4b1584ec4187486ca130a3a6114178ce71e7a935e3c"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.056497 4795 generic.go:334] "Generic (PLEG): container finished" podID="ea39bb65-aae0-48fe-ae6a-4736ab5cf336" containerID="5370c3b9f52498d8d68d002547aafadfddc52bd57ca2aedd2d68f2c757037832" exitCode=0 Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.056539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" event={"ID":"ea39bb65-aae0-48fe-ae6a-4736ab5cf336","Type":"ContainerDied","Data":"5370c3b9f52498d8d68d002547aafadfddc52bd57ca2aedd2d68f2c757037832"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.105260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" event={"ID":"83923f36-bf49-4a3d-a398-bbee1e13dfeb","Type":"ContainerStarted","Data":"b68b539bde59201bb57b33d322754f511837d9361c84bdbbc6470e2d0f6e1860"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.122898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" 
event={"ID":"c31b365f-c7a0-48ca-9118-141e6ac9b8fb","Type":"ContainerStarted","Data":"7fe97e1cc9696ad8cee4697b9e762b260daf7e939992829562e692e5a622b41a"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.124785 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.146547 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.147745 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.647722769 +0000 UTC m=+145.220326508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.151727 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" podStartSLOduration=125.151702776 podStartE2EDuration="2m5.151702776s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:33.042289909 +0000 UTC m=+144.614893648" watchObservedRunningTime="2025-12-05 08:26:33.151702776 +0000 UTC m=+144.724306515" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.152657 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-2v7bs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.152703 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" podUID="c31b365f-c7a0-48ca-9118-141e6ac9b8fb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.181359 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qcmrn" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.190747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" event={"ID":"bdbf5b12-68d6-4da4-90a8-48e275995388","Type":"ContainerStarted","Data":"a5db8a7230b6f908d366bae2d1f60752f811093bd90f0193f73fa8ea0d2e3865"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.202815 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" event={"ID":"4ffe336e-9a69-4b3e-81c7-34bf5333858f","Type":"ContainerStarted","Data":"7d2314383dc2c9babec05aa96ef363c84629d94a99246657fd7bf7e28bc03df2"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.223945 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" event={"ID":"e3e829d4-5649-4dcf-a646-1f7873175d2e","Type":"ContainerStarted","Data":"4c7a33ba51a6dbd1a45cd27a744f9f352164dbfdc3464acbe8f99832d9bfaa7e"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.241509 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" podStartSLOduration=125.241485382 podStartE2EDuration="2m5.241485382s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:33.240122082 +0000 UTC m=+144.812725821" watchObservedRunningTime="2025-12-05 08:26:33.241485382 +0000 UTC m=+144.814089121" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.242095 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" 
event={"ID":"2e371a7d-d8ef-4440-a940-af49a6a2d364","Type":"ContainerStarted","Data":"526244a4117672bd5c087a2a3d929195b25a5956c558f7172aadc795a742eadf"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.246357 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" event={"ID":"f220ce80-999b-41dd-a674-1fb462d12667","Type":"ContainerStarted","Data":"ac1616b2dee0deff67e99b03005b1b3d6d2faa47d1b259f2ebedfc8e9ddb74a1"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.247755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.249526 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.74950558 +0000 UTC m=+145.322109319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.277267 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" event={"ID":"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae","Type":"ContainerStarted","Data":"49ff7c92330d91d2ce189ef283662c4a6ea7b7cbfe40705fdd5556a3f97db13d"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.279071 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.287241 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5frx8" podStartSLOduration=126.287216956 podStartE2EDuration="2m6.287216956s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:33.286719231 +0000 UTC m=+144.859322970" watchObservedRunningTime="2025-12-05 08:26:33.287216956 +0000 UTC m=+144.859820695" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.307156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" event={"ID":"9c19621b-c574-4047-8586-75272bf2fbcc","Type":"ContainerStarted","Data":"c6172099135b1e8bc760150ba97ff8f5e01e6a79927fd4206b2e9b704f5c41db"} Dec 05 
08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.318052 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" event={"ID":"34897f01-f688-4dc5-8fdc-4468365baa92","Type":"ContainerStarted","Data":"4bb784ba37a92fd69c355dcc1b4fb5da7cd3e16938ab009d8897670a0f4831e3"} Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.349997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.351350 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.851329343 +0000 UTC m=+145.423933082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.395189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.452432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.456259 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:33.956245566 +0000 UTC m=+145.528849305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.463574 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:33 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:33 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:33 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.463712 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.570501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.570630 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:34.070596439 +0000 UTC m=+145.643200178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.571585 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.572085 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.072062202 +0000 UTC m=+145.644665941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.680143 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.680946 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.180915842 +0000 UTC m=+145.753519591 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.693578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.724965 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4gsqk" podStartSLOduration=126.724941525 podStartE2EDuration="2m6.724941525s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:33.641052273 +0000 UTC m=+145.213656012" watchObservedRunningTime="2025-12-05 08:26:33.724941525 +0000 UTC m=+145.297545264" Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.782513 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.282495538 +0000 UTC m=+145.855099277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.782104 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.887134 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.887486 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.387464763 +0000 UTC m=+145.960068502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.952281 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ql7nq" podStartSLOduration=125.95225941 podStartE2EDuration="2m5.95225941s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:33.947712976 +0000 UTC m=+145.520316705" watchObservedRunningTime="2025-12-05 08:26:33.95225941 +0000 UTC m=+145.524863149" Dec 05 08:26:33 crc kubenswrapper[4795]: I1205 08:26:33.995509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:33 crc kubenswrapper[4795]: E1205 08:26:33.995968 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.495947443 +0000 UTC m=+146.068551182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.097453 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.097849 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.597828367 +0000 UTC m=+146.170432106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.137601 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mm8ll" podStartSLOduration=127.137576852 podStartE2EDuration="2m7.137576852s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.001511717 +0000 UTC m=+145.574115466" watchObservedRunningTime="2025-12-05 08:26:34.137576852 +0000 UTC m=+145.710180601" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.203037 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.203867 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.703851923 +0000 UTC m=+146.276455662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.285391 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" podStartSLOduration=126.285361764 podStartE2EDuration="2m6.285361764s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.139780508 +0000 UTC m=+145.712384247" watchObservedRunningTime="2025-12-05 08:26:34.285361764 +0000 UTC m=+145.857965503" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.306708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.307330 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.807306044 +0000 UTC m=+146.379909783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.333640 4795 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tn798 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.29:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.333756 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" podUID="41bb386f-8261-4203-a385-f2918e5f9718" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.29:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.390956 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2ffkd" podStartSLOduration=126.390924948 podStartE2EDuration="2m6.390924948s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.287291972 +0000 UTC m=+145.859895711" watchObservedRunningTime="2025-12-05 08:26:34.390924948 +0000 UTC m=+145.963528687" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.410483 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.410963 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:34.91094541 +0000 UTC m=+146.483549149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.417218 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" event={"ID":"ea39bb65-aae0-48fe-ae6a-4736ab5cf336","Type":"ContainerStarted","Data":"e473e7b3dab851d303e99a912b0808ee3f65b30e07eab99afc312c7d05566af9"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.418069 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.426448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" 
event={"ID":"0849cac0-adb5-41b4-a67a-3f7dc195e78a","Type":"ContainerStarted","Data":"99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.427522 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.446700 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-47r47 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.446779 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.480463 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:34 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:34 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:34 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.480545 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.482448 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" event={"ID":"19c362f4-26e6-4cd3-84dc-648d240524d3","Type":"ContainerStarted","Data":"7c26d59683c941f718b6507b4c343595f9ee7835029b3e1f9fccaa93749e4eb1"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.484664 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b5r5w" podStartSLOduration=127.484636759 podStartE2EDuration="2m7.484636759s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.392402251 +0000 UTC m=+145.965006010" watchObservedRunningTime="2025-12-05 08:26:34.484636759 +0000 UTC m=+146.057240498" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.509493 4795 generic.go:334] "Generic (PLEG): container finished" podID="4ffe336e-9a69-4b3e-81c7-34bf5333858f" containerID="7d2314383dc2c9babec05aa96ef363c84629d94a99246657fd7bf7e28bc03df2" exitCode=0 Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.509651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" event={"ID":"4ffe336e-9a69-4b3e-81c7-34bf5333858f","Type":"ContainerDied","Data":"7d2314383dc2c9babec05aa96ef363c84629d94a99246657fd7bf7e28bc03df2"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.509687 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" event={"ID":"4ffe336e-9a69-4b3e-81c7-34bf5333858f","Type":"ContainerStarted","Data":"f76e5aa64659dc36da8f3c7ca29277132e5061a4c3300e0271f055f20053e255"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.511111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.511534 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.011494154 +0000 UTC m=+146.584097893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.584546 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" podStartSLOduration=126.584519795 podStartE2EDuration="2m6.584519795s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.579697482 +0000 UTC m=+146.152301221" watchObservedRunningTime="2025-12-05 08:26:34.584519795 +0000 UTC m=+146.157123534" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.594251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" event={"ID":"6717d13e-f472-49df-9a8c-0519ed3c556e","Type":"ContainerStarted","Data":"452f346f98107ac18545e96fe20521a8ceb7da128a712958ec33c03a494c0de2"} Dec 05 08:26:34 crc 
kubenswrapper[4795]: I1205 08:26:34.612938 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.614635 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.114598295 +0000 UTC m=+146.687202044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.632929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" event={"ID":"f49bbd69-4725-4cdc-9904-33d0755bde86","Type":"ContainerStarted","Data":"71c832a7c21d7d54aa3c4afab5d6d25b22ac6dffed526fa15f597554f028d92d"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.632993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" event={"ID":"f49bbd69-4725-4cdc-9904-33d0755bde86","Type":"ContainerStarted","Data":"1a0415866d4fe284de7ccd47a56ce25e1e519c914ffae47b4960175762b66190"} Dec 05 08:26:34 crc kubenswrapper[4795]: 
I1205 08:26:34.666492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" event={"ID":"c0d87aa9-b51d-46a0-b060-37fa44a12238","Type":"ContainerStarted","Data":"e5b5d9d88ebed853dd8857c9e6d16341e9cac56512e020d0de8af84d8a22646d"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.681750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" event={"ID":"4697a500-88a3-4061-9e72-c60cce09d33b","Type":"ContainerStarted","Data":"ed743ca4e99747d5ccae896ce4912f62dbe1d67fc45d05028d17a2638c4e8e0d"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.707503 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" event={"ID":"c951e8a1-6a1b-44d9-9d36-0516636d679c","Type":"ContainerStarted","Data":"e2dbc7dbe1507124a20033d90d5f283cf4d8730554a2d1674f245ac3d5bdfdf2"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.715904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" event={"ID":"51007333-db4c-4be3-961b-c6b114574db6","Type":"ContainerStarted","Data":"f4e79f440a6e3f584e6df2d00eef943b38613e2389c5285d09db411b88030a75"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.717474 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.718298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 
08:26:34.720761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ps5dv" podStartSLOduration=127.720745834 podStartE2EDuration="2m7.720745834s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.720118076 +0000 UTC m=+146.292721815" watchObservedRunningTime="2025-12-05 08:26:34.720745834 +0000 UTC m=+146.293349573" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.722790 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" podStartSLOduration=127.722783945 podStartE2EDuration="2m7.722783945s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.666213601 +0000 UTC m=+146.238817340" watchObservedRunningTime="2025-12-05 08:26:34.722783945 +0000 UTC m=+146.295387674" Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.723261 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.223243928 +0000 UTC m=+146.795847667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.733516 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" event={"ID":"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7","Type":"ContainerStarted","Data":"b8350cbd772971d1458908f9ffa9b196433eb9070264b57155d29e43ebe6f3a2"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.733791 4795 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-vrdc4 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.733863 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" podUID="51007333-db4c-4be3-961b-c6b114574db6" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.775713 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.775754 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dknzh" 
event={"ID":"371f5b94-9d02-45c4-8f06-2aeb582f1bbc","Type":"ContainerStarted","Data":"49e8c69cef3ee4c6d4bd354784eb8d63562f52932168253439d27b17835bcb44"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.775775 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dknzh" event={"ID":"371f5b94-9d02-45c4-8f06-2aeb582f1bbc","Type":"ContainerStarted","Data":"ef6029114b57f56ca17409b2b7f29a1db85b72c77c6faa34d3714e9a44f9f86a"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.784991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" event={"ID":"ab895138-2fff-4449-b071-d4ad7b35ff07","Type":"ContainerStarted","Data":"fe632e10fc84c963b2a76aa07df0f0b9ba99abb3125d31416bb5b3cf5e153e1f"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.812533 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" event={"ID":"83923f36-bf49-4a3d-a398-bbee1e13dfeb","Type":"ContainerStarted","Data":"91e4830322428dc24748de10d10667fafed2b65f15a30fc149c78a9687b19a44"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.824165 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.827467 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.327445482 +0000 UTC m=+146.900049221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.829390 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" event={"ID":"4079335a-cdd7-48c7-8c64-7493bda89ed9","Type":"ContainerStarted","Data":"0f06322e4b9049dd02a095aba54740b41364ea96c39d5dec62018007860514c2"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.829477 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.840093 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" event={"ID":"daf348e0-0463-4007-8696-5c1b1483348b","Type":"ContainerStarted","Data":"25e820fa7728896a0b0d8b6b09de409fa02c750abe669603fd71932cf797a919"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.859761 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jdmf7" podStartSLOduration=126.859739646 podStartE2EDuration="2m6.859739646s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.790017884 +0000 UTC m=+146.362621623" watchObservedRunningTime="2025-12-05 08:26:34.859739646 +0000 UTC m=+146.432343385" Dec 05 08:26:34 crc kubenswrapper[4795]: 
I1205 08:26:34.886915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" event={"ID":"ca452ca6-76ab-4ce8-898f-5a7b35a7137b","Type":"ContainerStarted","Data":"6260dad8e778246a009938f3d4130078f4adaaceca6e7e95df8b11927190716e"} Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.891907 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.891961 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.928104 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.930449 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.430427878 +0000 UTC m=+147.003031617 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.931012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.931185 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pswnm" podStartSLOduration=127.93116295 podStartE2EDuration="2m7.93116295s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.870872746 +0000 UTC m=+146.443476485" watchObservedRunningTime="2025-12-05 08:26:34.93116295 +0000 UTC m=+146.503766689" Dec 05 08:26:34 crc kubenswrapper[4795]: E1205 08:26:34.931694 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.431683645 +0000 UTC m=+147.004287384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:34 crc kubenswrapper[4795]: I1205 08:26:34.953533 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-j54hb" podStartSLOduration=126.953502981 podStartE2EDuration="2m6.953502981s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:34.921473343 +0000 UTC m=+146.494077082" watchObservedRunningTime="2025-12-05 08:26:34.953502981 +0000 UTC m=+146.526106720" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.005023 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" podStartSLOduration=127.004981504 podStartE2EDuration="2m7.004981504s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.001577633 +0000 UTC m=+146.574181382" watchObservedRunningTime="2025-12-05 08:26:35.004981504 +0000 UTC m=+146.577585243" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.035577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.038424 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.538395842 +0000 UTC m=+147.110999581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.065804 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v7z2z" podStartSLOduration=128.065780832 podStartE2EDuration="2m8.065780832s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.06367582 +0000 UTC m=+146.636279569" watchObservedRunningTime="2025-12-05 08:26:35.065780832 +0000 UTC m=+146.638384571" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.139295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 
05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.139695 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.639680268 +0000 UTC m=+147.212284007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.167417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.175511 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" podStartSLOduration=127.175486198 podStartE2EDuration="2m7.175486198s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.124774818 +0000 UTC m=+146.697378567" watchObservedRunningTime="2025-12-05 08:26:35.175486198 +0000 UTC m=+146.748089937" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.240688 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.241064 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.741043637 +0000 UTC m=+147.313647376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.248452 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-x6pjq" podStartSLOduration=127.248428506 podStartE2EDuration="2m7.248428506s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.244957493 +0000 UTC m=+146.817561232" watchObservedRunningTime="2025-12-05 08:26:35.248428506 +0000 UTC m=+146.821032245" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.249497 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" podStartSLOduration=127.249487337 podStartE2EDuration="2m7.249487337s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.176015573 +0000 UTC m=+146.748619312" 
watchObservedRunningTime="2025-12-05 08:26:35.249487337 +0000 UTC m=+146.822091076" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.336706 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vp2pv" podStartSLOduration=128.336676206 podStartE2EDuration="2m8.336676206s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.28273658 +0000 UTC m=+146.855340319" watchObservedRunningTime="2025-12-05 08:26:35.336676206 +0000 UTC m=+146.909279935" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.342793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.343254 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.843239201 +0000 UTC m=+147.415842940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.397086 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-tgggp" podStartSLOduration=128.397066633 podStartE2EDuration="2m8.397066633s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.395337601 +0000 UTC m=+146.967941350" watchObservedRunningTime="2025-12-05 08:26:35.397066633 +0000 UTC m=+146.969670372" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.398630 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-cwk8c" podStartSLOduration=127.398623788 podStartE2EDuration="2m7.398623788s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.338022486 +0000 UTC m=+146.910626225" watchObservedRunningTime="2025-12-05 08:26:35.398623788 +0000 UTC m=+146.971227527" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.427126 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dknzh" podStartSLOduration=10.427097752 podStartE2EDuration="10.427097752s" podCreationTimestamp="2025-12-05 08:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.422573807 +0000 UTC m=+146.995177556" watchObservedRunningTime="2025-12-05 08:26:35.427097752 +0000 UTC m=+146.999701491" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.444076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.444690 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:35.944663941 +0000 UTC m=+147.517267680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.449380 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:35 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:35 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:35 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.449459 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.546387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.546852 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:36.046834893 +0000 UTC m=+147.619438632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.552137 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-gbmkb" podStartSLOduration=127.552115709 podStartE2EDuration="2m7.552115709s" podCreationTimestamp="2025-12-05 08:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:35.508689785 +0000 UTC m=+147.081293534" watchObservedRunningTime="2025-12-05 08:26:35.552115709 +0000 UTC m=+147.124719448" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.647540 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.647741 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.147707207 +0000 UTC m=+147.720310946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.647871 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.648316 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.148304086 +0000 UTC m=+147.720907825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.701669 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.749321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.749526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.749562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:35 crc kubenswrapper[4795]: 
I1205 08:26:35.749636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.749657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.750056 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.250033485 +0000 UTC m=+147.822637214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.771146 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.772186 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.772776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.850973 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.851404 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.351384273 +0000 UTC m=+147.923988022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.869537 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.890742 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-2v7bs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.890823 4795 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" podUID="c31b365f-c7a0-48ca-9118-141e6ac9b8fb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.909046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" event={"ID":"4ffe336e-9a69-4b3e-81c7-34bf5333858f","Type":"ContainerStarted","Data":"4f0d3bd1311b3cdf1bb5ac118d4131a5f6071283ba3f0dc9fb05458ad976c6de"} Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.930694 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-47r47 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.930748 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.931794 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.931847 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.952025 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.952272 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.452234957 +0000 UTC m=+148.024838696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.952364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:35 crc kubenswrapper[4795]: E1205 08:26:35.952745 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.452734341 +0000 UTC m=+148.025338080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.963935 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 05 08:26:35 crc kubenswrapper[4795]: I1205 08:26:35.977075 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.051086 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vrdc4" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.058174 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.060255 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.560227622 +0000 UTC m=+148.132831361 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.082518 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.160479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.160868 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.660853149 +0000 UTC m=+148.233456898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.235257 4795 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6fkrg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.235352 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" podUID="ea39bb65-aae0-48fe-ae6a-4736ab5cf336" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.235452 4795 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6fkrg container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.235467 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" podUID="ea39bb65-aae0-48fe-ae6a-4736ab5cf336" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.264872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.265564 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.765541365 +0000 UTC m=+148.338145094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.366716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.367164 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:36.867148971 +0000 UTC m=+148.439752710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.444345 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:36 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:36 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:36 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.444473 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.468153 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.468553 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:36.968531061 +0000 UTC m=+148.541134800 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.569532 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.570020 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.070002183 +0000 UTC m=+148.642605922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.670799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.671274 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.171251888 +0000 UTC m=+148.743855627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.772552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.772979 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.272965787 +0000 UTC m=+148.845569526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.804337 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" podStartSLOduration=129.804314904 podStartE2EDuration="2m9.804314904s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:36.477501187 +0000 UTC m=+148.050104926" watchObservedRunningTime="2025-12-05 08:26:36.804314904 +0000 UTC m=+148.376918643" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.875799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.876558 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.376536731 +0000 UTC m=+148.949140470 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.941252 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-2v7bs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.941345 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" podUID="c31b365f-c7a0-48ca-9118-141e6ac9b8fb" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.979291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" event={"ID":"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7","Type":"ContainerStarted","Data":"f84bd80d1c3ddf6a477a38f6dbe910ed16c221b9f6c419634f00e56e1fdd2394"} Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.980672 4795 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6fkrg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 
08:26:36.980766 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-47r47 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.980834 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.980753 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" podUID="ea39bb65-aae0-48fe-ae6a-4736ab5cf336" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 05 08:26:36 crc kubenswrapper[4795]: I1205 08:26:36.982666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:36 crc kubenswrapper[4795]: E1205 08:26:36.982986 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.48297229 +0000 UTC m=+149.055576029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.085926 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.087890 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.587866232 +0000 UTC m=+149.160469981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.188992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.189337 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.689325975 +0000 UTC m=+149.261929714 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.290632 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.291074 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.791053944 +0000 UTC m=+149.363657683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.386938 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.387009 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.390872 4795 patch_prober.go:28] interesting pod/console-f9d7485db-r8zdl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.390934 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-r8zdl" podUID="67c6f735-c0f7-4539-a2d4-0785b4238435" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.392633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 
08:26:37.392982 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.892969899 +0000 UTC m=+149.465573638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.412915 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hx4v9"] Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.426040 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.443451 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:37 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:37 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:37 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.443509 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.443566 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.453360 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.477561 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hx4v9"] Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.497167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.497397 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-utilities\") pod \"certified-operators-hx4v9\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.497495 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kk6f\" (UniqueName: \"kubernetes.io/projected/34467c19-ae33-49c1-871d-b2499252f0dd-kube-api-access-9kk6f\") pod \"certified-operators-hx4v9\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.497547 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-catalog-content\") pod \"certified-operators-hx4v9\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.499162 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:37.999130719 +0000 UTC m=+149.571734458 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.563633 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2s2bq"] Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.564711 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.595813 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.595849 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.599241 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.599302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kk6f\" (UniqueName: \"kubernetes.io/projected/34467c19-ae33-49c1-871d-b2499252f0dd-kube-api-access-9kk6f\") pod \"certified-operators-hx4v9\" (UID: 
\"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.599348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-catalog-content\") pod \"certified-operators-hx4v9\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.599400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-utilities\") pod \"community-operators-2s2bq\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.599417 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4xtb\" (UniqueName: \"kubernetes.io/projected/22f937a2-2821-4972-b16e-a266a8a3a837-kube-api-access-h4xtb\") pod \"community-operators-2s2bq\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.599474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-utilities\") pod \"certified-operators-hx4v9\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.599509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-catalog-content\") pod 
\"community-operators-2s2bq\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.600568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-utilities\") pod \"certified-operators-hx4v9\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.600866 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.100847959 +0000 UTC m=+149.673451688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.601863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-catalog-content\") pod \"certified-operators-hx4v9\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.604973 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.607394 4795 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.670066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kk6f\" (UniqueName: \"kubernetes.io/projected/34467c19-ae33-49c1-871d-b2499252f0dd-kube-api-access-9kk6f\") pod \"certified-operators-hx4v9\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.700662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.700930 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.200896588 +0000 UTC m=+149.773500327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.701069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.701229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-utilities\") pod \"community-operators-2s2bq\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.701310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4xtb\" (UniqueName: \"kubernetes.io/projected/22f937a2-2821-4972-b16e-a266a8a3a837-kube-api-access-h4xtb\") pod \"community-operators-2s2bq\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.701494 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-catalog-content\") pod \"community-operators-2s2bq\" (UID: 
\"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.703229 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.203212197 +0000 UTC m=+149.775815936 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.704162 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-catalog-content\") pod \"community-operators-2s2bq\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.704217 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-utilities\") pod \"community-operators-2s2bq\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.740754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2s2bq"] Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.772699 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.775280 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8rltt"] Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.776352 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.806058 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.806658 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-utilities\") pod \"certified-operators-8rltt\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.806683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4b5\" (UniqueName: \"kubernetes.io/projected/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-kube-api-access-lt4b5\") pod \"certified-operators-8rltt\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.806788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-catalog-content\") pod \"certified-operators-8rltt\" (UID: 
\"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.806922 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.306903775 +0000 UTC m=+149.879507514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.808693 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4xtb\" (UniqueName: \"kubernetes.io/projected/22f937a2-2821-4972-b16e-a266a8a3a837-kube-api-access-h4xtb\") pod \"community-operators-2s2bq\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.886743 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rltt"] Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.889477 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.905196 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.905271 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.905200 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.905724 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.910482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-catalog-content\") pod \"certified-operators-8rltt\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.910536 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-utilities\") pod \"certified-operators-8rltt\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.910564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4b5\" (UniqueName: \"kubernetes.io/projected/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-kube-api-access-lt4b5\") pod \"certified-operators-8rltt\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.910591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:37 crc kubenswrapper[4795]: E1205 08:26:37.910994 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.410978853 +0000 UTC m=+149.983582592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.911341 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-utilities\") pod \"certified-operators-8rltt\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.911672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-catalog-content\") pod \"certified-operators-8rltt\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.911716 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:37 crc kubenswrapper[4795]: I1205 08:26:37.911765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.012442 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 
05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.012857 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.512819437 +0000 UTC m=+150.085423176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.013006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.013695 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.513682002 +0000 UTC m=+150.086285741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.027442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4b5\" (UniqueName: \"kubernetes.io/projected/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-kube-api-access-lt4b5\") pod \"certified-operators-8rltt\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.046896 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"91fe3fa785437337b3918493c30348718770f42b57c7d9e94773937ad0350b05"} Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.051256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" event={"ID":"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7","Type":"ContainerStarted","Data":"4778c8ac28863d182635f74d1501112e00cca560b3b28343b768d82a8b6945bd"} Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.077980 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6s54g" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.122157 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6b7vp"] Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.122784 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.123251 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.623229733 +0000 UTC m=+150.195833472 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.123529 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.130088 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6b7vp"] Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.176395 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.233850 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-catalog-content\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.233946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.233970 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmpv\" (UniqueName: \"kubernetes.io/projected/0a149548-294b-4acf-9c86-c036a0ce0fa4-kube-api-access-2mmpv\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.234013 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-utilities\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.237561 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.737538495 +0000 UTC m=+150.310142234 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.335197 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.335416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-catalog-content\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.335463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmpv\" (UniqueName: \"kubernetes.io/projected/0a149548-294b-4acf-9c86-c036a0ce0fa4-kube-api-access-2mmpv\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.335488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-utilities\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.335905 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-utilities\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.335981 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.835962467 +0000 UTC m=+150.408566206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.336190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-catalog-content\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.395229 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mmpv\" 
(UniqueName: \"kubernetes.io/projected/0a149548-294b-4acf-9c86-c036a0ce0fa4-kube-api-access-2mmpv\") pod \"community-operators-6b7vp\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.436640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.437028 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:38.937007416 +0000 UTC m=+150.509611145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.455815 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:38 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:38 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:38 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.455879 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.481910 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.496868 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-2v7bs" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.511811 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-47r47 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.511888 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.511989 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-47r47 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.512010 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.538294 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.538777 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.038756016 +0000 UTC m=+150.611359755 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.640167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.641017 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.140994201 +0000 UTC m=+150.713598120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.751221 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.751673 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.251627753 +0000 UTC m=+150.824231492 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.755125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.755452 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.255438997 +0000 UTC m=+150.828042736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.859430 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.859719 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.35968012 +0000 UTC m=+150.932283849 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.860153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.860585 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.360577036 +0000 UTC m=+150.933180775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:38 crc kubenswrapper[4795]: I1205 08:26:38.965929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:38 crc kubenswrapper[4795]: E1205 08:26:38.966742 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.466719747 +0000 UTC m=+151.039323486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.068083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.068468 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.568454846 +0000 UTC m=+151.141058585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.103579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e16c3ceebc0d84635467e7ae4bdd3a90ac53743df4dddda6bce5b6c18d055e8a"} Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.139113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6427dc7273af9da034df9045f5615a46fa9e0c11dfe9dd22e24f981258729058"} Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.139178 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c137d1b0df458ca85191fd34a0bb50a5c6ccb1ed7eccefe064b4c9d6947d1c73"} Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.155560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8ccd247a1e9fb11602b0192efe69177ac3f690b88e0723aa3a98836f70cdcbe2"} Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.171194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.171835 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.671813534 +0000 UTC m=+151.244417273 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.276558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.277004 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.776989596 +0000 UTC m=+151.349593335 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.379215 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.381021 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.880995282 +0000 UTC m=+151.453599021 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.485521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.486308 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:39.986284258 +0000 UTC m=+151.558887997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.495511 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:39 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:39 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:39 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.495587 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.549144 4795 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.586409 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.586714 4795 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pqptl"] Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.586911 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:40.086887654 +0000 UTC m=+151.659491393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.587829 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.629184 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.688489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-catalog-content\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.688565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls9hg\" (UniqueName: \"kubernetes.io/projected/2d3b856d-6654-4b3f-8b42-92c92e968f86-kube-api-access-ls9hg\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.688599 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.688734 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-utilities\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc 
kubenswrapper[4795]: E1205 08:26:39.689102 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:40.189087457 +0000 UTC m=+151.761691206 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.766075 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqptl"] Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.789390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.789892 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:40.289860948 +0000 UTC m=+151.862464687 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.790222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls9hg\" (UniqueName: \"kubernetes.io/projected/2d3b856d-6654-4b3f-8b42-92c92e968f86-kube-api-access-ls9hg\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.790670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.790927 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-utilities\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.791356 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-05 08:26:40.291347973 +0000 UTC m=+151.863951712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.793586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-catalog-content\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.792909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-utilities\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.794168 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-catalog-content\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.890105 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5v6t9"] Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.891239 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ls9hg\" (UniqueName: \"kubernetes.io/projected/2d3b856d-6654-4b3f-8b42-92c92e968f86-kube-api-access-ls9hg\") pod \"redhat-marketplace-pqptl\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.892343 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.895635 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:39 crc kubenswrapper[4795]: E1205 08:26:39.896111 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:40.396089361 +0000 UTC m=+151.968693100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.910516 4795 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-05T08:26:39.549174978Z","Handler":null,"Name":""} Dec 05 08:26:39 crc kubenswrapper[4795]: I1205 08:26:39.916497 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:39.997379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp62b\" (UniqueName: \"kubernetes.io/projected/e0b9f734-3994-4f64-92c4-86b71233f20a-kube-api-access-mp62b\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:39.997418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-utilities\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:39.997456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:39.997492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-catalog-content\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: E1205 08:26:39.997902 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-05 08:26:40.497886932 +0000 UTC m=+152.070490671 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hpn6h" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.000150 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v6t9"] Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.099333 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.099604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-utilities\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.099720 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-catalog-content\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.099782 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp62b\" (UniqueName: 
\"kubernetes.io/projected/e0b9f734-3994-4f64-92c4-86b71233f20a-kube-api-access-mp62b\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: E1205 08:26:40.100256 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-05 08:26:40.60023586 +0000 UTC m=+152.172839599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.100727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-utilities\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.100956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-catalog-content\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.185542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp62b\" 
(UniqueName: \"kubernetes.io/projected/e0b9f734-3994-4f64-92c4-86b71233f20a-kube-api-access-mp62b\") pod \"redhat-marketplace-5v6t9\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.194564 4795 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.194642 4795 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.196538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fafc7ac433309e1194c4915844f500ccf5e09d13dd4d43f6ad28e71c5d9f2025"} Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.200677 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.203396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" event={"ID":"a0ad2af1-a387-49d4-9e6c-3dadfe6800d7","Type":"ContainerStarted","Data":"449d338657a675135df73fa96bb5a78349b067d6217ca8768653899aab823893"} Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.203440 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.215749 4795 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6fkrg container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.215837 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" podUID="ea39bb65-aae0-48fe-ae6a-4736ab5cf336" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.217494 4795 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6fkrg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.217639 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" podUID="ea39bb65-aae0-48fe-ae6a-4736ab5cf336" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.219510 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.280372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hx4v9"] Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.357821 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6b7vp"] Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.436435 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.436919 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.440219 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:40 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:40 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:40 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.440296 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.512039 4795 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ng26g container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]log ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]etcd ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/max-in-flight-filter ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 05 08:26:40 crc kubenswrapper[4795]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 05 08:26:40 crc kubenswrapper[4795]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/project.openshift.io-projectcache ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/openshift.io-startinformers ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 05 08:26:40 crc kubenswrapper[4795]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 08:26:40 crc kubenswrapper[4795]: livez check failed Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.512123 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" podUID="4ffe336e-9a69-4b3e-81c7-34bf5333858f" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.833782 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.834162 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.856793 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" podStartSLOduration=15.856765041 podStartE2EDuration="15.856765041s" podCreationTimestamp="2025-12-05 08:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:40.656882167 +0000 UTC m=+152.229485906" watchObservedRunningTime="2025-12-05 08:26:40.856765041 +0000 UTC m=+152.429368780" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.858488 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8rltt"] Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.869900 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hpn6h\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.901791 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.929463 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 05 08:26:40 crc kubenswrapper[4795]: I1205 08:26:40.943464 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2s2bq"] Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.142575 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7khpv"] Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.155720 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.157142 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.165672 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.201048 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7khpv"] Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.237401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-catalog-content\") pod \"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.237486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-utilities\") pod \"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.237530 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wkqb\" (UniqueName: \"kubernetes.io/projected/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-kube-api-access-8wkqb\") pod \"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.243137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2s2bq" event={"ID":"22f937a2-2821-4972-b16e-a266a8a3a837","Type":"ContainerStarted","Data":"230f9391e9659d5627a33a2e4388626c108cbfef0f89d26d5ea04eb34081cb14"} Dec 05 08:26:41 crc 
kubenswrapper[4795]: I1205 08:26:41.314952 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx4v9" event={"ID":"34467c19-ae33-49c1-871d-b2499252f0dd","Type":"ContainerStarted","Data":"399874e26ae2dcaaa6316d6068037f74a2ce9ecbff96bb5c8c46fabca17bb902"} Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.344042 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6b7vp" event={"ID":"0a149548-294b-4acf-9c86-c036a0ce0fa4","Type":"ContainerStarted","Data":"d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40"} Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.344106 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6b7vp" event={"ID":"0a149548-294b-4acf-9c86-c036a0ce0fa4","Type":"ContainerStarted","Data":"fa4a1d58753cbaee96412258304543c657b5ba86262da1c33b7e60f4ea0ad81d"} Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.349561 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-utilities\") pod \"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.349738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wkqb\" (UniqueName: \"kubernetes.io/projected/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-kube-api-access-8wkqb\") pod \"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.349796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-catalog-content\") pod 
\"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.350175 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-utilities\") pod \"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.350272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-catalog-content\") pod \"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.381683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rltt" event={"ID":"0e74a7c4-6d12-4252-9d93-e8950c9a7e46","Type":"ContainerStarted","Data":"d92f75c1a2b09a4dd3ed94c3ec0ff22f9bc096caaaed9162169a50333238ec36"} Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.441124 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:41 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:41 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:41 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.441201 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.529683 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cmh44"] Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.548607 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.553436 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-utilities\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.553488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-catalog-content\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.553515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86wdc\" (UniqueName: \"kubernetes.io/projected/12857aab-68ed-47de-9796-c247fee349a3-kube-api-access-86wdc\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.590499 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cmh44"] Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.654337 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-utilities\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.654441 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-catalog-content\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.654468 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86wdc\" (UniqueName: \"kubernetes.io/projected/12857aab-68ed-47de-9796-c247fee349a3-kube-api-access-86wdc\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.655558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-utilities\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.656050 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-catalog-content\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.716443 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wkqb\" (UniqueName: 
\"kubernetes.io/projected/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-kube-api-access-8wkqb\") pod \"redhat-operators-7khpv\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.717410 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqptl"] Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.724457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86wdc\" (UniqueName: \"kubernetes.io/projected/12857aab-68ed-47de-9796-c247fee349a3-kube-api-access-86wdc\") pod \"redhat-operators-cmh44\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.781982 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.829571 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.879989 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v6t9"] Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.913089 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.935078 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.938596 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.938925 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 05 08:26:41 crc kubenswrapper[4795]: I1205 08:26:41.952242 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.036190 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hpn6h"] Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.063542 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.063604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.169985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.170105 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.170185 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.221504 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.273918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.369024 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.406502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" event={"ID":"8b4264f3-206f-4bb6-ba2f-8ba2fa485060","Type":"ContainerStarted","Data":"6d01e12402f478a5a58c8ccb8cef425655ec0c1e4722d9022d83ac755abae530"} Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.416989 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqptl" event={"ID":"2d3b856d-6654-4b3f-8b42-92c92e968f86","Type":"ContainerStarted","Data":"927a540000ad56769f6d17fb9f0fc6388f498a02a87e03d9d11f3b1cbf4f3e44"} Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.422867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v6t9" event={"ID":"e0b9f734-3994-4f64-92c4-86b71233f20a","Type":"ContainerStarted","Data":"76209e12dbd8d75ea3792d8b13d7a83670afc93ae322649f21f858466a2e7d0b"} Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.425178 4795 generic.go:334] "Generic (PLEG): container finished" podID="34467c19-ae33-49c1-871d-b2499252f0dd" containerID="c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373" exitCode=0 Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.425559 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx4v9" event={"ID":"34467c19-ae33-49c1-871d-b2499252f0dd","Type":"ContainerDied","Data":"c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373"} Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.435516 4795 generic.go:334] "Generic (PLEG): container finished" podID="0a149548-294b-4acf-9c86-c036a0ce0fa4" containerID="d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40" exitCode=0 Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.436400 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-6b7vp" event={"ID":"0a149548-294b-4acf-9c86-c036a0ce0fa4","Type":"ContainerDied","Data":"d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40"} Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.447875 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:42 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:42 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:42 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.447942 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.754561 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.816788 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7khpv"] Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.921254 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.934995 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ng26g" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.990258 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.991077 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.995113 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 08:26:42 crc kubenswrapper[4795]: I1205 08:26:42.999521 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.036774 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cmh44"] Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.038939 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.090255 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32759673-d0a2-4d01-9c90-51c94da9cdc1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"32759673-d0a2-4d01-9c90-51c94da9cdc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.090362 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32759673-d0a2-4d01-9c90-51c94da9cdc1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"32759673-d0a2-4d01-9c90-51c94da9cdc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.192195 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/32759673-d0a2-4d01-9c90-51c94da9cdc1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"32759673-d0a2-4d01-9c90-51c94da9cdc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.192321 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32759673-d0a2-4d01-9c90-51c94da9cdc1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"32759673-d0a2-4d01-9c90-51c94da9cdc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.192315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32759673-d0a2-4d01-9c90-51c94da9cdc1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"32759673-d0a2-4d01-9c90-51c94da9cdc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.241449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32759673-d0a2-4d01-9c90-51c94da9cdc1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"32759673-d0a2-4d01-9c90-51c94da9cdc1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.273979 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dknzh" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.309566 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.439551 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:43 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:43 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:43 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:43 crc kubenswrapper[4795]: I1205 08:26:43.439617 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:44 crc kubenswrapper[4795]: I1205 08:26:44.438370 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:44 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:44 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:44 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:44 crc kubenswrapper[4795]: I1205 08:26:44.438447 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:45 crc kubenswrapper[4795]: I1205 08:26:45.012091 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 08:26:45 crc kubenswrapper[4795]: I1205 08:26:45.442114 4795 patch_prober.go:28] 
interesting pod/router-default-5444994796-xcg4r container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 05 08:26:45 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Dec 05 08:26:45 crc kubenswrapper[4795]: [+]process-running ok Dec 05 08:26:45 crc kubenswrapper[4795]: healthz check failed Dec 05 08:26:45 crc kubenswrapper[4795]: I1205 08:26:45.442577 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xcg4r" podUID="44511bda-0717-4c08-adf2-7dd984e85120" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:45 crc kubenswrapper[4795]: I1205 08:26:45.465085 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7khpv" event={"ID":"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1","Type":"ContainerStarted","Data":"be2f2b9c09e9f723e152f397f1aeb53735b83880628a739f3bc7df4c7252fe11"} Dec 05 08:26:45 crc kubenswrapper[4795]: I1205 08:26:45.466409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmh44" event={"ID":"12857aab-68ed-47de-9796-c247fee349a3","Type":"ContainerStarted","Data":"926f965bcdd7a62547e3808682dafe23ffdd3a883e123d06d1ab089fb3d12858"} Dec 05 08:26:45 crc kubenswrapper[4795]: I1205 08:26:45.593340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 05 08:26:45 crc kubenswrapper[4795]: W1205 08:26:45.603310 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1363bf31_33a0_43f8_9bef_aac7bfd8bd71.slice/crio-57222391aa141b3a647f205bc73a88d044cbc3e04274d30335e331b8788bf796 WatchSource:0}: Error finding container 57222391aa141b3a647f205bc73a88d044cbc3e04274d30335e331b8788bf796: Status 404 returned error can't find the container 
with id 57222391aa141b3a647f205bc73a88d044cbc3e04274d30335e331b8788bf796 Dec 05 08:26:45 crc kubenswrapper[4795]: I1205 08:26:45.645867 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 05 08:26:46 crc kubenswrapper[4795]: I1205 08:26:46.008039 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-bfv9k" podUID="a0ad2af1-a387-49d4-9e6c-3dadfe6800d7" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:26:46 crc kubenswrapper[4795]: I1205 08:26:46.503085 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"32759673-d0a2-4d01-9c90-51c94da9cdc1","Type":"ContainerStarted","Data":"d8b1cfebea29f071f684cfd416b666e3cd1f04122d4cbf4aeb69912d48461a6c"} Dec 05 08:26:46 crc kubenswrapper[4795]: I1205 08:26:46.513543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rltt" event={"ID":"0e74a7c4-6d12-4252-9d93-e8950c9a7e46","Type":"ContainerStarted","Data":"6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55"} Dec 05 08:26:46 crc kubenswrapper[4795]: I1205 08:26:46.515186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1363bf31-33a0-43f8-9bef-aac7bfd8bd71","Type":"ContainerStarted","Data":"57222391aa141b3a647f205bc73a88d044cbc3e04274d30335e331b8788bf796"} Dec 05 08:26:46 crc kubenswrapper[4795]: I1205 08:26:46.516501 4795 generic.go:334] "Generic (PLEG): container finished" podID="9da05ac8-31e6-4fb6-b8d4-b10d5cc26821" containerID="ccc4fa14d91dac69258353003e0ef4116337a90fa2a6d3594c4e85764a686450" exitCode=0 Dec 05 08:26:46 crc kubenswrapper[4795]: I1205 08:26:46.516535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" 
event={"ID":"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821","Type":"ContainerDied","Data":"ccc4fa14d91dac69258353003e0ef4116337a90fa2a6d3594c4e85764a686450"} Dec 05 08:26:46 crc kubenswrapper[4795]: I1205 08:26:46.610775 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:46 crc kubenswrapper[4795]: I1205 08:26:46.619544 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xcg4r" Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.386548 4795 patch_prober.go:28] interesting pod/console-f9d7485db-r8zdl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.386969 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-r8zdl" podUID="67c6f735-c0f7-4539-a2d4-0785b4238435" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.524843 4795 generic.go:334] "Generic (PLEG): container finished" podID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" containerID="6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55" exitCode=0 Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.524939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rltt" event={"ID":"0e74a7c4-6d12-4252-9d93-e8950c9a7e46","Type":"ContainerDied","Data":"6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55"} Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.526224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2s2bq" 
event={"ID":"22f937a2-2821-4972-b16e-a266a8a3a837","Type":"ContainerStarted","Data":"09be6c20c5d35d94c82939750df346feb2ef4a9ba070b26f647784ddfa41f3ec"} Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.886571 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.886665 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.886673 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.886731 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:47 crc kubenswrapper[4795]: I1205 08:26:47.949306 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.075994 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-secret-volume\") pod \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.076495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29rg5\" (UniqueName: \"kubernetes.io/projected/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-kube-api-access-29rg5\") pod \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.076575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-config-volume\") pod \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\" (UID: \"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821\") " Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.077443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-config-volume" (OuterVolumeSpecName: "config-volume") pod "9da05ac8-31e6-4fb6-b8d4-b10d5cc26821" (UID: "9da05ac8-31e6-4fb6-b8d4-b10d5cc26821"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.097784 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9da05ac8-31e6-4fb6-b8d4-b10d5cc26821" (UID: "9da05ac8-31e6-4fb6-b8d4-b10d5cc26821"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.103218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-kube-api-access-29rg5" (OuterVolumeSpecName: "kube-api-access-29rg5") pod "9da05ac8-31e6-4fb6-b8d4-b10d5cc26821" (UID: "9da05ac8-31e6-4fb6-b8d4-b10d5cc26821"). InnerVolumeSpecName "kube-api-access-29rg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.178928 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29rg5\" (UniqueName: \"kubernetes.io/projected/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-kube-api-access-29rg5\") on node \"crc\" DevicePath \"\"" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.178976 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.178993 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.509899 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.532481 4795 generic.go:334] "Generic (PLEG): container finished" podID="22f937a2-2821-4972-b16e-a266a8a3a837" containerID="09be6c20c5d35d94c82939750df346feb2ef4a9ba070b26f647784ddfa41f3ec" exitCode=0 Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.532556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2s2bq" 
event={"ID":"22f937a2-2821-4972-b16e-a266a8a3a837","Type":"ContainerDied","Data":"09be6c20c5d35d94c82939750df346feb2ef4a9ba070b26f647784ddfa41f3ec"} Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.534313 4795 generic.go:334] "Generic (PLEG): container finished" podID="12857aab-68ed-47de-9796-c247fee349a3" containerID="eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0" exitCode=0 Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.534367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmh44" event={"ID":"12857aab-68ed-47de-9796-c247fee349a3","Type":"ContainerDied","Data":"eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0"} Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.540841 4795 generic.go:334] "Generic (PLEG): container finished" podID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerID="11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3" exitCode=0 Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.540960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v6t9" event={"ID":"e0b9f734-3994-4f64-92c4-86b71233f20a","Type":"ContainerDied","Data":"11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3"} Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.543309 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" event={"ID":"9da05ac8-31e6-4fb6-b8d4-b10d5cc26821","Type":"ContainerDied","Data":"5543b7dccf9e7de1d3db3465bb8b3dfd3581a34f8e00faab1996c776858d7eec"} Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.543383 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5543b7dccf9e7de1d3db3465bb8b3dfd3581a34f8e00faab1996c776858d7eec" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.543340 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t" Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.545263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"32759673-d0a2-4d01-9c90-51c94da9cdc1","Type":"ContainerStarted","Data":"2c56ba1192a0b2739bf3cf6f94b784229204af95b56d8d0444dbbe8163014669"} Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.546767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1363bf31-33a0-43f8-9bef-aac7bfd8bd71","Type":"ContainerStarted","Data":"1fc8206bf32c9353ac6ffca305f4e49297747397ada13db9c233ec452cb164d1"} Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.548023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" event={"ID":"8b4264f3-206f-4bb6-ba2f-8ba2fa485060","Type":"ContainerStarted","Data":"4518276949a843b0206ff095d98d6173ff5a7941be07ca04a5cc1bff846108b8"} Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.549289 4795 generic.go:334] "Generic (PLEG): container finished" podID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" containerID="9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52" exitCode=0 Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.549356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7khpv" event={"ID":"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1","Type":"ContainerDied","Data":"9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52"} Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.550675 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerID="b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884" exitCode=0 Dec 05 08:26:48 crc kubenswrapper[4795]: I1205 08:26:48.551480 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqptl" event={"ID":"2d3b856d-6654-4b3f-8b42-92c92e968f86","Type":"ContainerDied","Data":"b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884"} Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.570651 4795 generic.go:334] "Generic (PLEG): container finished" podID="32759673-d0a2-4d01-9c90-51c94da9cdc1" containerID="2c56ba1192a0b2739bf3cf6f94b784229204af95b56d8d0444dbbe8163014669" exitCode=0 Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.570769 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"32759673-d0a2-4d01-9c90-51c94da9cdc1","Type":"ContainerDied","Data":"2c56ba1192a0b2739bf3cf6f94b784229204af95b56d8d0444dbbe8163014669"} Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.571484 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.616590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.623450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6c9f96ec-f615-4030-a78d-2dd56932c6c1-metrics-certs\") pod \"network-metrics-daemon-8cnbm\" (UID: \"6c9f96ec-f615-4030-a78d-2dd56932c6c1\") " pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.660039 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
podStartSLOduration=9.660006208 podStartE2EDuration="9.660006208s" podCreationTimestamp="2025-12-05 08:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:50.655392062 +0000 UTC m=+162.227995801" watchObservedRunningTime="2025-12-05 08:26:50.660006208 +0000 UTC m=+162.232609957" Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.670966 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8cnbm" Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.677886 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" podStartSLOduration=143.677863457 podStartE2EDuration="2m23.677863457s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:50.675514747 +0000 UTC m=+162.248118486" watchObservedRunningTime="2025-12-05 08:26:50.677863457 +0000 UTC m=+162.250467196" Dec 05 08:26:50 crc kubenswrapper[4795]: I1205 08:26:50.967184 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8cnbm"] Dec 05 08:26:51 crc kubenswrapper[4795]: I1205 08:26:51.582553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" event={"ID":"6c9f96ec-f615-4030-a78d-2dd56932c6c1","Type":"ContainerStarted","Data":"07ae9a448694897e6d6f9f36da7edc150efd8b9ea8ce41ac729bdcaaebdf4640"} Dec 05 08:26:51 crc kubenswrapper[4795]: I1205 08:26:51.585336 4795 generic.go:334] "Generic (PLEG): container finished" podID="1363bf31-33a0-43f8-9bef-aac7bfd8bd71" containerID="1fc8206bf32c9353ac6ffca305f4e49297747397ada13db9c233ec452cb164d1" exitCode=0 Dec 05 08:26:51 crc kubenswrapper[4795]: I1205 08:26:51.585570 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1363bf31-33a0-43f8-9bef-aac7bfd8bd71","Type":"ContainerDied","Data":"1fc8206bf32c9353ac6ffca305f4e49297747397ada13db9c233ec452cb164d1"} Dec 05 08:26:51 crc kubenswrapper[4795]: I1205 08:26:51.990828 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.140176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32759673-d0a2-4d01-9c90-51c94da9cdc1-kubelet-dir\") pod \"32759673-d0a2-4d01-9c90-51c94da9cdc1\" (UID: \"32759673-d0a2-4d01-9c90-51c94da9cdc1\") " Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.140315 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32759673-d0a2-4d01-9c90-51c94da9cdc1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "32759673-d0a2-4d01-9c90-51c94da9cdc1" (UID: "32759673-d0a2-4d01-9c90-51c94da9cdc1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.140418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32759673-d0a2-4d01-9c90-51c94da9cdc1-kube-api-access\") pod \"32759673-d0a2-4d01-9c90-51c94da9cdc1\" (UID: \"32759673-d0a2-4d01-9c90-51c94da9cdc1\") " Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.141027 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32759673-d0a2-4d01-9c90-51c94da9cdc1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.148596 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32759673-d0a2-4d01-9c90-51c94da9cdc1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "32759673-d0a2-4d01-9c90-51c94da9cdc1" (UID: "32759673-d0a2-4d01-9c90-51c94da9cdc1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.242204 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32759673-d0a2-4d01-9c90-51c94da9cdc1-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.605900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" event={"ID":"6c9f96ec-f615-4030-a78d-2dd56932c6c1","Type":"ContainerStarted","Data":"52dd3559e45be4da6429798d60b369891fb2980519c01a599b9662097eb2cf2f"} Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.609198 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.610798 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"32759673-d0a2-4d01-9c90-51c94da9cdc1","Type":"ContainerDied","Data":"d8b1cfebea29f071f684cfd416b666e3cd1f04122d4cbf4aeb69912d48461a6c"} Dec 05 08:26:52 crc kubenswrapper[4795]: I1205 08:26:52.610829 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8b1cfebea29f071f684cfd416b666e3cd1f04122d4cbf4aeb69912d48461a6c" Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.065048 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.161220 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kube-api-access\") pod \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\" (UID: \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\") " Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.161400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kubelet-dir\") pod \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\" (UID: \"1363bf31-33a0-43f8-9bef-aac7bfd8bd71\") " Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.161476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1363bf31-33a0-43f8-9bef-aac7bfd8bd71" (UID: "1363bf31-33a0-43f8-9bef-aac7bfd8bd71"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.183197 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1363bf31-33a0-43f8-9bef-aac7bfd8bd71" (UID: "1363bf31-33a0-43f8-9bef-aac7bfd8bd71"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.265660 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.265736 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1363bf31-33a0-43f8-9bef-aac7bfd8bd71-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.637426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1363bf31-33a0-43f8-9bef-aac7bfd8bd71","Type":"ContainerDied","Data":"57222391aa141b3a647f205bc73a88d044cbc3e04274d30335e331b8788bf796"} Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.637751 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 05 08:26:53 crc kubenswrapper[4795]: I1205 08:26:53.639344 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57222391aa141b3a647f205bc73a88d044cbc3e04274d30335e331b8788bf796" Dec 05 08:26:54 crc kubenswrapper[4795]: I1205 08:26:54.648844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8cnbm" event={"ID":"6c9f96ec-f615-4030-a78d-2dd56932c6c1","Type":"ContainerStarted","Data":"5dc5c9d171e053f7835d32d52857425d3e77992e800c639a7f4119a299371823"} Dec 05 08:26:55 crc kubenswrapper[4795]: I1205 08:26:55.689526 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8cnbm" podStartSLOduration=148.689469638 podStartE2EDuration="2m28.689469638s" podCreationTimestamp="2025-12-05 08:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:26:55.683078748 +0000 UTC m=+167.255682497" watchObservedRunningTime="2025-12-05 08:26:55.689469638 +0000 UTC m=+167.262073387" Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.398344 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.405319 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.892308 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.892374 4795 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.892431 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.892308 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.892750 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.893560 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.893591 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.893709 4795 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="download-server" containerStatusID={"Type":"cri-o","ID":"d510b8f9c85b2fd239a150339c8515456d849b5c54ecb28507cdac1871f1ef99"} pod="openshift-console/downloads-7954f5f757-qsf8w" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 05 08:26:57 crc kubenswrapper[4795]: I1205 08:26:57.893802 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" containerID="cri-o://d510b8f9c85b2fd239a150339c8515456d849b5c54ecb28507cdac1871f1ef99" gracePeriod=2 Dec 05 08:26:58 crc kubenswrapper[4795]: I1205 08:26:58.701255 4795 generic.go:334] "Generic (PLEG): container finished" podID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerID="d510b8f9c85b2fd239a150339c8515456d849b5c54ecb28507cdac1871f1ef99" exitCode=0 Dec 05 08:26:58 crc kubenswrapper[4795]: I1205 08:26:58.701343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qsf8w" event={"ID":"245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d","Type":"ContainerDied","Data":"d510b8f9c85b2fd239a150339c8515456d849b5c54ecb28507cdac1871f1ef99"} Dec 05 08:27:07 crc kubenswrapper[4795]: I1205 08:27:07.168532 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-ckn4t" Dec 05 08:27:07 crc kubenswrapper[4795]: I1205 08:27:07.886228 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:07 crc kubenswrapper[4795]: I1205 08:27:07.886708 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:10 crc kubenswrapper[4795]: I1205 08:27:10.827115 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:27:10 crc kubenswrapper[4795]: I1205 08:27:10.827589 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:27:10 crc kubenswrapper[4795]: I1205 08:27:10.908033 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:27:16 crc kubenswrapper[4795]: I1205 08:27:16.064319 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 05 08:27:17 crc kubenswrapper[4795]: I1205 08:27:17.887186 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:17 crc kubenswrapper[4795]: I1205 08:27:17.887327 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" 
Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.165102 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 08:27:19 crc kubenswrapper[4795]: E1205 08:27:19.166078 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da05ac8-31e6-4fb6-b8d4-b10d5cc26821" containerName="collect-profiles" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.166107 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da05ac8-31e6-4fb6-b8d4-b10d5cc26821" containerName="collect-profiles" Dec 05 08:27:19 crc kubenswrapper[4795]: E1205 08:27:19.166150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32759673-d0a2-4d01-9c90-51c94da9cdc1" containerName="pruner" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.166163 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32759673-d0a2-4d01-9c90-51c94da9cdc1" containerName="pruner" Dec 05 08:27:19 crc kubenswrapper[4795]: E1205 08:27:19.166180 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1363bf31-33a0-43f8-9bef-aac7bfd8bd71" containerName="pruner" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.166195 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1363bf31-33a0-43f8-9bef-aac7bfd8bd71" containerName="pruner" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.166362 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1363bf31-33a0-43f8-9bef-aac7bfd8bd71" containerName="pruner" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.166394 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da05ac8-31e6-4fb6-b8d4-b10d5cc26821" containerName="collect-profiles" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.166410 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32759673-d0a2-4d01-9c90-51c94da9cdc1" containerName="pruner" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.167183 4795 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.170384 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.172293 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.176084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.315541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09276d48-d8ca-41e5-bff2-6f620f68d53d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"09276d48-d8ca-41e5-bff2-6f620f68d53d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.315652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09276d48-d8ca-41e5-bff2-6f620f68d53d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09276d48-d8ca-41e5-bff2-6f620f68d53d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.417295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09276d48-d8ca-41e5-bff2-6f620f68d53d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09276d48-d8ca-41e5-bff2-6f620f68d53d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.417416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09276d48-d8ca-41e5-bff2-6f620f68d53d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"09276d48-d8ca-41e5-bff2-6f620f68d53d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.417470 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09276d48-d8ca-41e5-bff2-6f620f68d53d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09276d48-d8ca-41e5-bff2-6f620f68d53d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.443744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09276d48-d8ca-41e5-bff2-6f620f68d53d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"09276d48-d8ca-41e5-bff2-6f620f68d53d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:19 crc kubenswrapper[4795]: I1205 08:27:19.503879 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.562638 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.563860 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.583131 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.695413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b992d50-5e09-444d-813a-a2c4cfa25e05-kube-api-access\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.695938 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-var-lock\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.695963 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.797538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.797671 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1b992d50-5e09-444d-813a-a2c4cfa25e05-kube-api-access\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.797718 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-var-lock\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.797811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-var-lock\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.798250 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.819978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b992d50-5e09-444d-813a-a2c4cfa25e05-kube-api-access\") pod \"installer-9-crc\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:24 crc kubenswrapper[4795]: I1205 08:27:24.900408 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:27:27 crc kubenswrapper[4795]: I1205 08:27:27.887437 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:27 crc kubenswrapper[4795]: I1205 08:27:27.887535 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.937539 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.939684 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ls9hg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-pqptl_openshift-marketplace(2d3b856d-6654-4b3f-8b42-92c92e968f86): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.940575 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:fcd9cdaeec4d21f010a2bb25043386ef71e3c6ca9c62aaf284b705dd309b1475: Get \"https://registry.redhat.io/v2/redhat/community-operator-index/blobs/sha256:fcd9cdaeec4d21f010a2bb25043386ef71e3c6ca9c62aaf284b705dd309b1475\": 
context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.940764 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4xtb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2s2bq_openshift-marketplace(22f937a2-2821-4972-b16e-a266a8a3a837): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob 
sha256:fcd9cdaeec4d21f010a2bb25043386ef71e3c6ca9c62aaf284b705dd309b1475: Get \"https://registry.redhat.io/v2/redhat/community-operator-index/blobs/sha256:fcd9cdaeec4d21f010a2bb25043386ef71e3c6ca9c62aaf284b705dd309b1475\": context canceled" logger="UnhandledError" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.941851 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-pqptl" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.941987 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:fcd9cdaeec4d21f010a2bb25043386ef71e3c6ca9c62aaf284b705dd309b1475: Get \\\"https://registry.redhat.io/v2/redhat/community-operator-index/blobs/sha256:fcd9cdaeec4d21f010a2bb25043386ef71e3c6ca9c62aaf284b705dd309b1475\\\": context canceled\"" pod="openshift-marketplace/community-operators-2s2bq" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.975059 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.975602 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mp62b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5v6t9_openshift-marketplace(e0b9f734-3994-4f64-92c4-86b71233f20a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 08:27:31 crc kubenswrapper[4795]: E1205 08:27:31.977037 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5v6t9" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" Dec 05 08:27:37 crc 
kubenswrapper[4795]: I1205 08:27:37.886643 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:37 crc kubenswrapper[4795]: I1205 08:27:37.887486 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:38 crc kubenswrapper[4795]: E1205 08:27:38.643252 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 08:27:38 crc kubenswrapper[4795]: E1205 08:27:38.643989 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-86wdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cmh44_openshift-marketplace(12857aab-68ed-47de-9796-c247fee349a3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 08:27:38 crc kubenswrapper[4795]: E1205 08:27:38.645194 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cmh44" podUID="12857aab-68ed-47de-9796-c247fee349a3" Dec 05 08:27:40 crc 
kubenswrapper[4795]: E1205 08:27:40.669847 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 08:27:40 crc kubenswrapper[4795]: E1205 08:27:40.671184 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lt4b5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-8rltt_openshift-marketplace(0e74a7c4-6d12-4252-9d93-e8950c9a7e46): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 08:27:40 crc kubenswrapper[4795]: E1205 08:27:40.673085 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-8rltt" podUID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" Dec 05 08:27:40 crc kubenswrapper[4795]: I1205 08:27:40.827521 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:27:40 crc kubenswrapper[4795]: I1205 08:27:40.827595 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:27:40 crc kubenswrapper[4795]: I1205 08:27:40.827672 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:27:40 crc kubenswrapper[4795]: I1205 08:27:40.828325 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Dec 05 08:27:40 crc kubenswrapper[4795]: I1205 08:27:40.828403 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca" gracePeriod=600 Dec 05 08:27:41 crc kubenswrapper[4795]: I1205 08:27:41.195762 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca" exitCode=0 Dec 05 08:27:41 crc kubenswrapper[4795]: I1205 08:27:41.195893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca"} Dec 05 08:27:42 crc kubenswrapper[4795]: E1205 08:27:42.681584 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 05 08:27:42 crc kubenswrapper[4795]: E1205 08:27:42.682230 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2mmpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6b7vp_openshift-marketplace(0a149548-294b-4acf-9c86-c036a0ce0fa4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 08:27:42 crc kubenswrapper[4795]: E1205 08:27:42.683461 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6b7vp" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" Dec 05 08:27:42 crc 
kubenswrapper[4795]: E1205 08:27:42.697707 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 05 08:27:42 crc kubenswrapper[4795]: E1205 08:27:42.697900 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wkqb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-7khpv_openshift-marketplace(8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 08:27:42 crc kubenswrapper[4795]: E1205 08:27:42.699387 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7khpv" podUID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" Dec 05 08:27:42 crc kubenswrapper[4795]: E1205 08:27:42.728960 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 05 08:27:42 crc kubenswrapper[4795]: E1205 08:27:42.729519 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9kk6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hx4v9_openshift-marketplace(34467c19-ae33-49c1-871d-b2499252f0dd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 08:27:42 crc kubenswrapper[4795]: E1205 08:27:42.732011 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hx4v9" podUID="34467c19-ae33-49c1-871d-b2499252f0dd" Dec 05 08:27:43 crc 
kubenswrapper[4795]: I1205 08:27:43.080511 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 05 08:27:43 crc kubenswrapper[4795]: I1205 08:27:43.201461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 05 08:27:43 crc kubenswrapper[4795]: W1205 08:27:43.222761 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1b992d50_5e09_444d_813a_a2c4cfa25e05.slice/crio-54293734f2edd823448479706f97219b52f9ee77663087899f57fbdadb05f150 WatchSource:0}: Error finding container 54293734f2edd823448479706f97219b52f9ee77663087899f57fbdadb05f150: Status 404 returned error can't find the container with id 54293734f2edd823448479706f97219b52f9ee77663087899f57fbdadb05f150 Dec 05 08:27:43 crc kubenswrapper[4795]: I1205 08:27:43.231724 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"13b0b2039e83205cd0e777e65ca04f19114a91a938417aba534d0de698607453"} Dec 05 08:27:43 crc kubenswrapper[4795]: I1205 08:27:43.236866 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"09276d48-d8ca-41e5-bff2-6f620f68d53d","Type":"ContainerStarted","Data":"4a3504eaea835b005687884d56815fab4126e79150d6c7a34684b90014c998d8"} Dec 05 08:27:43 crc kubenswrapper[4795]: I1205 08:27:43.247095 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qsf8w" event={"ID":"245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d","Type":"ContainerStarted","Data":"0a02c530424512fa2923f9d6a12a1def2038d6ef554e61beb991baf6da3d8b86"} Dec 05 08:27:43 crc kubenswrapper[4795]: I1205 08:27:43.247153 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 
08:27:43 crc kubenswrapper[4795]: I1205 08:27:43.247724 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:43 crc kubenswrapper[4795]: I1205 08:27:43.247830 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:43 crc kubenswrapper[4795]: E1205 08:27:43.264425 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6b7vp" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" Dec 05 08:27:43 crc kubenswrapper[4795]: E1205 08:27:43.267866 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hx4v9" podUID="34467c19-ae33-49c1-871d-b2499252f0dd" Dec 05 08:27:44 crc kubenswrapper[4795]: I1205 08:27:44.250833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1b992d50-5e09-444d-813a-a2c4cfa25e05","Type":"ContainerStarted","Data":"54293734f2edd823448479706f97219b52f9ee77663087899f57fbdadb05f150"} Dec 05 08:27:44 crc kubenswrapper[4795]: I1205 08:27:44.253037 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"09276d48-d8ca-41e5-bff2-6f620f68d53d","Type":"ContainerStarted","Data":"b4df3a0994205d29780f806ae88f9f4929c963561ca6abcfcd6c2e92bb0873ec"} Dec 05 08:27:44 crc kubenswrapper[4795]: I1205 08:27:44.253860 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:44 crc kubenswrapper[4795]: I1205 08:27:44.253920 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:45 crc kubenswrapper[4795]: I1205 08:27:45.260169 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1b992d50-5e09-444d-813a-a2c4cfa25e05","Type":"ContainerStarted","Data":"5a47b9e6bf24fb7d269c30cde35c4b7a723dc17335fab4448f2425ada5b778d3"} Dec 05 08:27:45 crc kubenswrapper[4795]: I1205 08:27:45.294559 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=21.294538832 podStartE2EDuration="21.294538832s" podCreationTimestamp="2025-12-05 08:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:27:45.292990585 +0000 UTC m=+216.865594324" watchObservedRunningTime="2025-12-05 08:27:45.294538832 +0000 UTC m=+216.867142571" Dec 05 08:27:45 crc kubenswrapper[4795]: I1205 08:27:45.295175 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=26.29516912 podStartE2EDuration="26.29516912s" 
podCreationTimestamp="2025-12-05 08:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:27:45.277036192 +0000 UTC m=+216.849639931" watchObservedRunningTime="2025-12-05 08:27:45.29516912 +0000 UTC m=+216.867772859" Dec 05 08:27:46 crc kubenswrapper[4795]: I1205 08:27:46.267308 4795 generic.go:334] "Generic (PLEG): container finished" podID="09276d48-d8ca-41e5-bff2-6f620f68d53d" containerID="b4df3a0994205d29780f806ae88f9f4929c963561ca6abcfcd6c2e92bb0873ec" exitCode=0 Dec 05 08:27:46 crc kubenswrapper[4795]: I1205 08:27:46.268697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"09276d48-d8ca-41e5-bff2-6f620f68d53d","Type":"ContainerDied","Data":"b4df3a0994205d29780f806ae88f9f4929c963561ca6abcfcd6c2e92bb0873ec"} Dec 05 08:27:47 crc kubenswrapper[4795]: I1205 08:27:47.886005 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:47 crc kubenswrapper[4795]: I1205 08:27:47.886503 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:47 crc kubenswrapper[4795]: I1205 08:27:47.886066 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:47 crc kubenswrapper[4795]: I1205 08:27:47.886629 4795 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:48 crc kubenswrapper[4795]: I1205 08:27:48.536704 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:48 crc kubenswrapper[4795]: I1205 08:27:48.594138 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09276d48-d8ca-41e5-bff2-6f620f68d53d-kubelet-dir\") pod \"09276d48-d8ca-41e5-bff2-6f620f68d53d\" (UID: \"09276d48-d8ca-41e5-bff2-6f620f68d53d\") " Dec 05 08:27:48 crc kubenswrapper[4795]: I1205 08:27:48.594340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09276d48-d8ca-41e5-bff2-6f620f68d53d-kube-api-access\") pod \"09276d48-d8ca-41e5-bff2-6f620f68d53d\" (UID: \"09276d48-d8ca-41e5-bff2-6f620f68d53d\") " Dec 05 08:27:48 crc kubenswrapper[4795]: I1205 08:27:48.595817 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09276d48-d8ca-41e5-bff2-6f620f68d53d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "09276d48-d8ca-41e5-bff2-6f620f68d53d" (UID: "09276d48-d8ca-41e5-bff2-6f620f68d53d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:27:48 crc kubenswrapper[4795]: I1205 08:27:48.617649 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09276d48-d8ca-41e5-bff2-6f620f68d53d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "09276d48-d8ca-41e5-bff2-6f620f68d53d" (UID: "09276d48-d8ca-41e5-bff2-6f620f68d53d"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:27:48 crc kubenswrapper[4795]: I1205 08:27:48.698025 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09276d48-d8ca-41e5-bff2-6f620f68d53d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 08:27:48 crc kubenswrapper[4795]: I1205 08:27:48.698084 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09276d48-d8ca-41e5-bff2-6f620f68d53d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:27:49 crc kubenswrapper[4795]: I1205 08:27:49.290292 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"09276d48-d8ca-41e5-bff2-6f620f68d53d","Type":"ContainerDied","Data":"4a3504eaea835b005687884d56815fab4126e79150d6c7a34684b90014c998d8"} Dec 05 08:27:49 crc kubenswrapper[4795]: I1205 08:27:49.290345 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a3504eaea835b005687884d56815fab4126e79150d6c7a34684b90014c998d8" Dec 05 08:27:49 crc kubenswrapper[4795]: I1205 08:27:49.290464 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 05 08:27:52 crc kubenswrapper[4795]: I1205 08:27:52.309105 4795 generic.go:334] "Generic (PLEG): container finished" podID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerID="6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe" exitCode=0 Dec 05 08:27:52 crc kubenswrapper[4795]: I1205 08:27:52.309172 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v6t9" event={"ID":"e0b9f734-3994-4f64-92c4-86b71233f20a","Type":"ContainerDied","Data":"6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe"} Dec 05 08:27:52 crc kubenswrapper[4795]: I1205 08:27:52.312919 4795 generic.go:334] "Generic (PLEG): container finished" podID="22f937a2-2821-4972-b16e-a266a8a3a837" containerID="33fb7f0f7da29f49c9d899994d484528aa10df41ec5916c289fc753a6dc8b848" exitCode=0 Dec 05 08:27:52 crc kubenswrapper[4795]: I1205 08:27:52.312954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2s2bq" event={"ID":"22f937a2-2821-4972-b16e-a266a8a3a837","Type":"ContainerDied","Data":"33fb7f0f7da29f49c9d899994d484528aa10df41ec5916c289fc753a6dc8b848"} Dec 05 08:27:55 crc kubenswrapper[4795]: I1205 08:27:55.344206 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerID="a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9" exitCode=0 Dec 05 08:27:55 crc kubenswrapper[4795]: I1205 08:27:55.344251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqptl" event={"ID":"2d3b856d-6654-4b3f-8b42-92c92e968f86","Type":"ContainerDied","Data":"a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9"} Dec 05 08:27:57 crc kubenswrapper[4795]: I1205 08:27:57.886743 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: 
Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:57 crc kubenswrapper[4795]: I1205 08:27:57.886831 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:57 crc kubenswrapper[4795]: I1205 08:27:57.887313 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-qsf8w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 05 08:27:57 crc kubenswrapper[4795]: I1205 08:27:57.887386 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qsf8w" podUID="245fcccb-1a04-4a13-8aca-b9bcbbf1ee2d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 05 08:27:58 crc kubenswrapper[4795]: I1205 08:27:58.927127 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sb6fd"] Dec 05 08:27:58 crc kubenswrapper[4795]: E1205 08:27:58.927959 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09276d48-d8ca-41e5-bff2-6f620f68d53d" containerName="pruner" Dec 05 08:27:58 crc kubenswrapper[4795]: I1205 08:27:58.927979 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="09276d48-d8ca-41e5-bff2-6f620f68d53d" containerName="pruner" Dec 05 08:27:58 crc kubenswrapper[4795]: I1205 08:27:58.928121 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="09276d48-d8ca-41e5-bff2-6f620f68d53d" containerName="pruner" Dec 05 08:27:58 crc 
kubenswrapper[4795]: I1205 08:27:58.928739 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:58 crc kubenswrapper[4795]: I1205 08:27:58.970753 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sb6fd"] Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.055493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91b331a6-6a6f-443d-a101-86e642f45659-trusted-ca\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.055548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhmfp\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-kube-api-access-rhmfp\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.055581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-bound-sa-token\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.055687 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/91b331a6-6a6f-443d-a101-86e642f45659-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: 
\"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.055739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/91b331a6-6a6f-443d-a101-86e642f45659-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.055789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.055818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/91b331a6-6a6f-443d-a101-86e642f45659-registry-certificates\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.055870 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-registry-tls\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.088995 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.157512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/91b331a6-6a6f-443d-a101-86e642f45659-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.157584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/91b331a6-6a6f-443d-a101-86e642f45659-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.157659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/91b331a6-6a6f-443d-a101-86e642f45659-registry-certificates\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.157824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-registry-tls\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 
08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.157851 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91b331a6-6a6f-443d-a101-86e642f45659-trusted-ca\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.157870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhmfp\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-kube-api-access-rhmfp\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.157890 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-bound-sa-token\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.158472 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/91b331a6-6a6f-443d-a101-86e642f45659-ca-trust-extracted\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.159197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/91b331a6-6a6f-443d-a101-86e642f45659-registry-certificates\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.159938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/91b331a6-6a6f-443d-a101-86e642f45659-trusted-ca\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.165980 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/91b331a6-6a6f-443d-a101-86e642f45659-installation-pull-secrets\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.168454 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-registry-tls\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.177977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-bound-sa-token\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: \"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.182378 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhmfp\" (UniqueName: \"kubernetes.io/projected/91b331a6-6a6f-443d-a101-86e642f45659-kube-api-access-rhmfp\") pod \"image-registry-66df7c8f76-sb6fd\" (UID: 
\"91b331a6-6a6f-443d-a101-86e642f45659\") " pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:27:59 crc kubenswrapper[4795]: I1205 08:27:59.262181 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:28:06 crc kubenswrapper[4795]: I1205 08:28:06.858541 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tn798"] Dec 05 08:28:07 crc kubenswrapper[4795]: I1205 08:28:07.893009 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qsf8w" Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.081399 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-sb6fd"] Dec 05 08:28:08 crc kubenswrapper[4795]: W1205 08:28:08.114147 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91b331a6_6a6f_443d_a101_86e642f45659.slice/crio-b8207937972f24742d54a7480d2297c1b4f13ffd63348f233b765ba87718ae91 WatchSource:0}: Error finding container b8207937972f24742d54a7480d2297c1b4f13ffd63348f233b765ba87718ae91: Status 404 returned error can't find the container with id b8207937972f24742d54a7480d2297c1b4f13ffd63348f233b765ba87718ae91 Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.480941 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rltt"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.483747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqptl" event={"ID":"2d3b856d-6654-4b3f-8b42-92c92e968f86","Type":"ContainerStarted","Data":"e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce"} Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.496882 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-cmh44" event={"ID":"12857aab-68ed-47de-9796-c247fee349a3","Type":"ContainerStarted","Data":"af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd"} Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.498897 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hx4v9"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.508993 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v6t9" event={"ID":"e0b9f734-3994-4f64-92c4-86b71233f20a","Type":"ContainerStarted","Data":"c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245"} Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.523222 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2s2bq"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.523301 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6b7vp"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.543999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rltt" event={"ID":"0e74a7c4-6d12-4252-9d93-e8950c9a7e46","Type":"ContainerStarted","Data":"2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4"} Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.544251 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8rltt" podUID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" containerName="extract-content" containerID="cri-o://2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4" gracePeriod=30 Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.554124 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pqptl" podStartSLOduration=12.132356151 
podStartE2EDuration="1m29.554103804s" podCreationTimestamp="2025-12-05 08:26:39 +0000 UTC" firstStartedPulling="2025-12-05 08:26:50.572777418 +0000 UTC m=+162.145381157" lastFinishedPulling="2025-12-05 08:28:07.994525071 +0000 UTC m=+239.567128810" observedRunningTime="2025-12-05 08:28:08.535975815 +0000 UTC m=+240.108579554" watchObservedRunningTime="2025-12-05 08:28:08.554103804 +0000 UTC m=+240.126707543" Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.556235 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47r47"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.556462 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" containerID="cri-o://99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a" gracePeriod=30 Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.571510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" event={"ID":"91b331a6-6a6f-443d-a101-86e642f45659","Type":"ContainerStarted","Data":"a7d8d30746cf6c2b6beaacb57414c810699a424fa7f9c87c1a09a5d5f6b92e32"} Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.571562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" event={"ID":"91b331a6-6a6f-443d-a101-86e642f45659","Type":"ContainerStarted","Data":"b8207937972f24742d54a7480d2297c1b4f13ffd63348f233b765ba87718ae91"} Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.572357 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.573942 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-5v6t9"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.580421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7khpv" event={"ID":"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1","Type":"ContainerStarted","Data":"45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2"} Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.596356 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5v6t9" podStartSLOduration=11.677436109 podStartE2EDuration="1m29.596334958s" podCreationTimestamp="2025-12-05 08:26:39 +0000 UTC" firstStartedPulling="2025-12-05 08:26:49.566676783 +0000 UTC m=+161.139280522" lastFinishedPulling="2025-12-05 08:28:07.485575632 +0000 UTC m=+239.058179371" observedRunningTime="2025-12-05 08:28:08.593906816 +0000 UTC m=+240.166510555" watchObservedRunningTime="2025-12-05 08:28:08.596334958 +0000 UTC m=+240.168938697" Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.606549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2s2bq" event={"ID":"22f937a2-2821-4972-b16e-a266a8a3a837","Type":"ContainerStarted","Data":"7cdbf6a797cb12d6ddaf8dcc3c7544ba911c5629dc0abb41342c124db4b7846a"} Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.606798 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2s2bq" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" containerName="registry-server" containerID="cri-o://7cdbf6a797cb12d6ddaf8dcc3c7544ba911c5629dc0abb41342c124db4b7846a" gracePeriod=30 Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.618257 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nz6vn"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.619048 4795 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.635449 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqptl"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.668335 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7khpv"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.703584 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nz6vn"] Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.719203 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2s2bq" podStartSLOduration=24.572806934 podStartE2EDuration="1m31.719179377s" podCreationTimestamp="2025-12-05 08:26:37 +0000 UTC" firstStartedPulling="2025-12-05 08:26:49.565607162 +0000 UTC m=+161.138210901" lastFinishedPulling="2025-12-05 08:27:56.711979605 +0000 UTC m=+228.284583344" observedRunningTime="2025-12-05 08:28:08.707259722 +0000 UTC m=+240.279863461" watchObservedRunningTime="2025-12-05 08:28:08.719179377 +0000 UTC m=+240.291783116" Dec 05 08:28:08 crc kubenswrapper[4795]: I1205 08:28:08.724164 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cmh44"] Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.870649 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e854247-6adf-4f84-96b8-083f8772d8eb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.870787 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86r27\" (UniqueName: \"kubernetes.io/projected/5e854247-6adf-4f84-96b8-083f8772d8eb-kube-api-access-86r27\") pod \"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.870884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e854247-6adf-4f84-96b8-083f8772d8eb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.973736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e854247-6adf-4f84-96b8-083f8772d8eb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.973816 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86r27\" (UniqueName: \"kubernetes.io/projected/5e854247-6adf-4f84-96b8-083f8772d8eb-kube-api-access-86r27\") pod \"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.973870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e854247-6adf-4f84-96b8-083f8772d8eb-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.979649 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" podStartSLOduration=10.979627803 podStartE2EDuration="10.979627803s" podCreationTimestamp="2025-12-05 08:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:28:08.931024759 +0000 UTC m=+240.503628498" watchObservedRunningTime="2025-12-05 08:28:08.979627803 +0000 UTC m=+240.552231542" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.987955 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e854247-6adf-4f84-96b8-083f8772d8eb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:08.991738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e854247-6adf-4f84-96b8-083f8772d8eb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.002078 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86r27\" (UniqueName: \"kubernetes.io/projected/5e854247-6adf-4f84-96b8-083f8772d8eb-kube-api-access-86r27\") pod \"marketplace-operator-79b997595-nz6vn\" (UID: \"5e854247-6adf-4f84-96b8-083f8772d8eb\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.237802 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.527495 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.620354 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8rltt_0e74a7c4-6d12-4252-9d93-e8950c9a7e46/extract-content/0.log" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.626986 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.631114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx4v9" event={"ID":"34467c19-ae33-49c1-871d-b2499252f0dd","Type":"ContainerStarted","Data":"0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf"} Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.631292 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hx4v9" podUID="34467c19-ae33-49c1-871d-b2499252f0dd" containerName="extract-content" containerID="cri-o://0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf" gracePeriod=30 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.643774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j4kw\" (UniqueName: \"kubernetes.io/projected/0849cac0-adb5-41b4-a67a-3f7dc195e78a-kube-api-access-9j4kw\") pod \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " Dec 05 08:28:09 crc 
kubenswrapper[4795]: I1205 08:28:09.643885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-trusted-ca\") pod \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.643948 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-operator-metrics\") pod \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\" (UID: \"0849cac0-adb5-41b4-a67a-3f7dc195e78a\") " Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.648515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0849cac0-adb5-41b4-a67a-3f7dc195e78a" (UID: "0849cac0-adb5-41b4-a67a-3f7dc195e78a"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.648730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6b7vp" event={"ID":"0a149548-294b-4acf-9c86-c036a0ce0fa4","Type":"ContainerStarted","Data":"314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7"} Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.649037 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6b7vp" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" containerName="extract-content" containerID="cri-o://314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7" gracePeriod=30 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.665816 4795 generic.go:334] "Generic (PLEG): container finished" podID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerID="99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a" exitCode=0 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.665896 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" event={"ID":"0849cac0-adb5-41b4-a67a-3f7dc195e78a","Type":"ContainerDied","Data":"99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a"} Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.665933 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" event={"ID":"0849cac0-adb5-41b4-a67a-3f7dc195e78a","Type":"ContainerDied","Data":"79a6032a22553b2d5653ffa59d16fb157c7a2f8b4531b006b0257224cf29dfe1"} Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.665956 4795 scope.go:117] "RemoveContainer" containerID="99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.666094 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-47r47" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.679653 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8rltt_0e74a7c4-6d12-4252-9d93-e8950c9a7e46/extract-content/0.log" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.680898 4795 generic.go:334] "Generic (PLEG): container finished" podID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" containerID="2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4" exitCode=2 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.680969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rltt" event={"ID":"0e74a7c4-6d12-4252-9d93-e8950c9a7e46","Type":"ContainerDied","Data":"2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4"} Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.680999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8rltt" event={"ID":"0e74a7c4-6d12-4252-9d93-e8950c9a7e46","Type":"ContainerDied","Data":"d92f75c1a2b09a4dd3ed94c3ec0ff22f9bc096caaaed9162169a50333238ec36"} Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.681071 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8rltt" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.681864 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0849cac0-adb5-41b4-a67a-3f7dc195e78a" (UID: "0849cac0-adb5-41b4-a67a-3f7dc195e78a"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.681924 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0849cac0-adb5-41b4-a67a-3f7dc195e78a-kube-api-access-9j4kw" (OuterVolumeSpecName: "kube-api-access-9j4kw") pod "0849cac0-adb5-41b4-a67a-3f7dc195e78a" (UID: "0849cac0-adb5-41b4-a67a-3f7dc195e78a"). InnerVolumeSpecName "kube-api-access-9j4kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.694861 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2s2bq_22f937a2-2821-4972-b16e-a266a8a3a837/registry-server/0.log" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.703926 4795 generic.go:334] "Generic (PLEG): container finished" podID="22f937a2-2821-4972-b16e-a266a8a3a837" containerID="7cdbf6a797cb12d6ddaf8dcc3c7544ba911c5629dc0abb41342c124db4b7846a" exitCode=1 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.704355 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cmh44" podUID="12857aab-68ed-47de-9796-c247fee349a3" containerName="extract-content" containerID="cri-o://af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd" gracePeriod=30 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.704693 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2s2bq" event={"ID":"22f937a2-2821-4972-b16e-a266a8a3a837","Type":"ContainerDied","Data":"7cdbf6a797cb12d6ddaf8dcc3c7544ba911c5629dc0abb41342c124db4b7846a"} Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.704862 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pqptl" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerName="registry-server" 
containerID="cri-o://e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce" gracePeriod=30 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.705011 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7khpv" podUID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" containerName="extract-content" containerID="cri-o://45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2" gracePeriod=30 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.705222 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5v6t9" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerName="registry-server" containerID="cri-o://c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245" gracePeriod=30 Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.742870 4795 scope.go:117] "RemoveContainer" containerID="99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.746090 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt4b5\" (UniqueName: \"kubernetes.io/projected/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-kube-api-access-lt4b5\") pod \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.746193 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-utilities\") pod \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.746266 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-catalog-content\") pod 
\"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\" (UID: \"0e74a7c4-6d12-4252-9d93-e8950c9a7e46\") " Dec 05 08:28:09 crc kubenswrapper[4795]: E1205 08:28:09.746484 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a\": container with ID starting with 99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a not found: ID does not exist" containerID="99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.746544 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a"} err="failed to get container status \"99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a\": rpc error: code = NotFound desc = could not find container \"99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a\": container with ID starting with 99209622b4c2a3c373f9018eeddab1e2f57e90a547fc975dded158510ea9ac8a not found: ID does not exist" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.746579 4795 scope.go:117] "RemoveContainer" containerID="2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.746632 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j4kw\" (UniqueName: \"kubernetes.io/projected/0849cac0-adb5-41b4-a67a-3f7dc195e78a-kube-api-access-9j4kw\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.746651 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.746665 4795 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0849cac0-adb5-41b4-a67a-3f7dc195e78a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.747965 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-utilities" (OuterVolumeSpecName: "utilities") pod "0e74a7c4-6d12-4252-9d93-e8950c9a7e46" (UID: "0e74a7c4-6d12-4252-9d93-e8950c9a7e46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.759674 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-kube-api-access-lt4b5" (OuterVolumeSpecName: "kube-api-access-lt4b5") pod "0e74a7c4-6d12-4252-9d93-e8950c9a7e46" (UID: "0e74a7c4-6d12-4252-9d93-e8950c9a7e46"). InnerVolumeSpecName "kube-api-access-lt4b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.850821 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt4b5\" (UniqueName: \"kubernetes.io/projected/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-kube-api-access-lt4b5\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.850903 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.857374 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e74a7c4-6d12-4252-9d93-e8950c9a7e46" (UID: "0e74a7c4-6d12-4252-9d93-e8950c9a7e46"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.917152 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.933555 4795 scope.go:117] "RemoveContainer" containerID="6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.955551 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e74a7c4-6d12-4252-9d93-e8950c9a7e46-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.980052 4795 scope.go:117] "RemoveContainer" containerID="2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4" Dec 05 08:28:09 crc kubenswrapper[4795]: E1205 08:28:09.987130 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4\": container with ID starting with 2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4 not found: ID does not exist" containerID="2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.987170 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4"} err="failed to get container status \"2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4\": rpc error: code = NotFound desc = could not find container \"2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4\": container with ID starting with 2885681873cf86a3b4c9ea7ea035c55dd21acbef759a46bc80b92271a1ca25a4 not found: ID does not exist" Dec 05 08:28:09 crc 
kubenswrapper[4795]: I1205 08:28:09.987197 4795 scope.go:117] "RemoveContainer" containerID="6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55" Dec 05 08:28:09 crc kubenswrapper[4795]: E1205 08:28:09.989046 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55\": container with ID starting with 6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55 not found: ID does not exist" containerID="6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55" Dec 05 08:28:09 crc kubenswrapper[4795]: I1205 08:28:09.989106 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55"} err="failed to get container status \"6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55\": rpc error: code = NotFound desc = could not find container \"6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55\": container with ID starting with 6fc5264d0cfa3cdab05408f42f79b88289200520509592ba6a42bad9b5491e55 not found: ID does not exist" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.010474 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47r47"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.021110 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-47r47"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.116562 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2s2bq_22f937a2-2821-4972-b16e-a266a8a3a837/registry-server/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.123878 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.140114 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8rltt"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.157715 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8rltt"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.220977 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.260722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-utilities\") pod \"22f937a2-2821-4972-b16e-a266a8a3a837\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.260777 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-catalog-content\") pod \"22f937a2-2821-4972-b16e-a266a8a3a837\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.260811 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4xtb\" (UniqueName: \"kubernetes.io/projected/22f937a2-2821-4972-b16e-a266a8a3a837-kube-api-access-h4xtb\") pod \"22f937a2-2821-4972-b16e-a266a8a3a837\" (UID: \"22f937a2-2821-4972-b16e-a266a8a3a837\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.261986 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-utilities" (OuterVolumeSpecName: "utilities") pod "22f937a2-2821-4972-b16e-a266a8a3a837" (UID: 
"22f937a2-2821-4972-b16e-a266a8a3a837"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.265814 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f937a2-2821-4972-b16e-a266a8a3a837-kube-api-access-h4xtb" (OuterVolumeSpecName: "kube-api-access-h4xtb") pod "22f937a2-2821-4972-b16e-a266a8a3a837" (UID: "22f937a2-2821-4972-b16e-a266a8a3a837"). InnerVolumeSpecName "kube-api-access-h4xtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.323474 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nz6vn"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.344886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22f937a2-2821-4972-b16e-a266a8a3a837" (UID: "22f937a2-2821-4972-b16e-a266a8a3a837"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.358971 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hx4v9_34467c19-ae33-49c1-871d-b2499252f0dd/extract-content/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.359417 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.362044 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.362159 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f937a2-2821-4972-b16e-a266a8a3a837-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.362256 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4xtb\" (UniqueName: \"kubernetes.io/projected/22f937a2-2821-4972-b16e-a266a8a3a837-kube-api-access-h4xtb\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.464194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-utilities\") pod \"34467c19-ae33-49c1-871d-b2499252f0dd\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.464277 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-catalog-content\") pod \"34467c19-ae33-49c1-871d-b2499252f0dd\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.464342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kk6f\" (UniqueName: \"kubernetes.io/projected/34467c19-ae33-49c1-871d-b2499252f0dd-kube-api-access-9kk6f\") pod \"34467c19-ae33-49c1-871d-b2499252f0dd\" (UID: \"34467c19-ae33-49c1-871d-b2499252f0dd\") " Dec 05 08:28:10 crc kubenswrapper[4795]: 
I1205 08:28:10.465983 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-utilities" (OuterVolumeSpecName: "utilities") pod "34467c19-ae33-49c1-871d-b2499252f0dd" (UID: "34467c19-ae33-49c1-871d-b2499252f0dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.471997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34467c19-ae33-49c1-871d-b2499252f0dd-kube-api-access-9kk6f" (OuterVolumeSpecName: "kube-api-access-9kk6f") pod "34467c19-ae33-49c1-871d-b2499252f0dd" (UID: "34467c19-ae33-49c1-871d-b2499252f0dd"). InnerVolumeSpecName "kube-api-access-9kk6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.557099 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34467c19-ae33-49c1-871d-b2499252f0dd" (UID: "34467c19-ae33-49c1-871d-b2499252f0dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.566073 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.566098 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34467c19-ae33-49c1-871d-b2499252f0dd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.566111 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kk6f\" (UniqueName: \"kubernetes.io/projected/34467c19-ae33-49c1-871d-b2499252f0dd-kube-api-access-9kk6f\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.573248 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pqptl_2d3b856d-6654-4b3f-8b42-92c92e968f86/registry-server/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.574076 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.578148 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6b7vp_0a149548-294b-4acf-9c86-c036a0ce0fa4/extract-content/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.580933 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.583260 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7khpv_8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1/extract-content/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.584671 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.586528 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cmh44_12857aab-68ed-47de-9796-c247fee349a3/extract-content/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.586953 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.632094 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5v6t9_e0b9f734-3994-4f64-92c4-86b71233f20a/registry-server/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.633079 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667394 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-catalog-content\") pod \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-catalog-content\") pod \"2d3b856d-6654-4b3f-8b42-92c92e968f86\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667493 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls9hg\" (UniqueName: \"kubernetes.io/projected/2d3b856d-6654-4b3f-8b42-92c92e968f86-kube-api-access-ls9hg\") pod \"2d3b856d-6654-4b3f-8b42-92c92e968f86\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667530 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wkqb\" (UniqueName: \"kubernetes.io/projected/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-kube-api-access-8wkqb\") pod \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667569 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-catalog-content\") pod \"0a149548-294b-4acf-9c86-c036a0ce0fa4\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667625 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-utilities\") pod \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\" (UID: \"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667659 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mmpv\" (UniqueName: \"kubernetes.io/projected/0a149548-294b-4acf-9c86-c036a0ce0fa4-kube-api-access-2mmpv\") pod \"0a149548-294b-4acf-9c86-c036a0ce0fa4\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-utilities\") pod \"0a149548-294b-4acf-9c86-c036a0ce0fa4\" (UID: \"0a149548-294b-4acf-9c86-c036a0ce0fa4\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667729 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86wdc\" (UniqueName: \"kubernetes.io/projected/12857aab-68ed-47de-9796-c247fee349a3-kube-api-access-86wdc\") pod \"12857aab-68ed-47de-9796-c247fee349a3\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-utilities\") pod \"12857aab-68ed-47de-9796-c247fee349a3\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667783 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp62b\" (UniqueName: \"kubernetes.io/projected/e0b9f734-3994-4f64-92c4-86b71233f20a-kube-api-access-mp62b\") pod \"e0b9f734-3994-4f64-92c4-86b71233f20a\" (UID: 
\"e0b9f734-3994-4f64-92c4-86b71233f20a\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667803 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-utilities\") pod \"e0b9f734-3994-4f64-92c4-86b71233f20a\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667834 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-catalog-content\") pod \"e0b9f734-3994-4f64-92c4-86b71233f20a\" (UID: \"e0b9f734-3994-4f64-92c4-86b71233f20a\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667866 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-utilities\") pod \"2d3b856d-6654-4b3f-8b42-92c92e968f86\" (UID: \"2d3b856d-6654-4b3f-8b42-92c92e968f86\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.667921 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-catalog-content\") pod \"12857aab-68ed-47de-9796-c247fee349a3\" (UID: \"12857aab-68ed-47de-9796-c247fee349a3\") " Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.674195 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-utilities" (OuterVolumeSpecName: "utilities") pod "12857aab-68ed-47de-9796-c247fee349a3" (UID: "12857aab-68ed-47de-9796-c247fee349a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.675575 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-utilities" (OuterVolumeSpecName: "utilities") pod "e0b9f734-3994-4f64-92c4-86b71233f20a" (UID: "e0b9f734-3994-4f64-92c4-86b71233f20a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.677210 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d3b856d-6654-4b3f-8b42-92c92e968f86-kube-api-access-ls9hg" (OuterVolumeSpecName: "kube-api-access-ls9hg") pod "2d3b856d-6654-4b3f-8b42-92c92e968f86" (UID: "2d3b856d-6654-4b3f-8b42-92c92e968f86"). InnerVolumeSpecName "kube-api-access-ls9hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.680169 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b9f734-3994-4f64-92c4-86b71233f20a-kube-api-access-mp62b" (OuterVolumeSpecName: "kube-api-access-mp62b") pod "e0b9f734-3994-4f64-92c4-86b71233f20a" (UID: "e0b9f734-3994-4f64-92c4-86b71233f20a"). InnerVolumeSpecName "kube-api-access-mp62b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.682777 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-utilities" (OuterVolumeSpecName: "utilities") pod "8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" (UID: "8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.689296 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-kube-api-access-8wkqb" (OuterVolumeSpecName: "kube-api-access-8wkqb") pod "8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" (UID: "8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1"). InnerVolumeSpecName "kube-api-access-8wkqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.690505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d3b856d-6654-4b3f-8b42-92c92e968f86" (UID: "2d3b856d-6654-4b3f-8b42-92c92e968f86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.691470 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-utilities" (OuterVolumeSpecName: "utilities") pod "0a149548-294b-4acf-9c86-c036a0ce0fa4" (UID: "0a149548-294b-4acf-9c86-c036a0ce0fa4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.691711 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-utilities" (OuterVolumeSpecName: "utilities") pod "2d3b856d-6654-4b3f-8b42-92c92e968f86" (UID: "2d3b856d-6654-4b3f-8b42-92c92e968f86"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.693698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a149548-294b-4acf-9c86-c036a0ce0fa4-kube-api-access-2mmpv" (OuterVolumeSpecName: "kube-api-access-2mmpv") pod "0a149548-294b-4acf-9c86-c036a0ce0fa4" (UID: "0a149548-294b-4acf-9c86-c036a0ce0fa4"). InnerVolumeSpecName "kube-api-access-2mmpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.694735 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12857aab-68ed-47de-9796-c247fee349a3-kube-api-access-86wdc" (OuterVolumeSpecName: "kube-api-access-86wdc") pod "12857aab-68ed-47de-9796-c247fee349a3" (UID: "12857aab-68ed-47de-9796-c247fee349a3"). InnerVolumeSpecName "kube-api-access-86wdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.708517 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a149548-294b-4acf-9c86-c036a0ce0fa4" (UID: "0a149548-294b-4acf-9c86-c036a0ce0fa4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.713057 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6b7vp_0a149548-294b-4acf-9c86-c036a0ce0fa4/extract-content/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.713379 4795 generic.go:334] "Generic (PLEG): container finished" podID="0a149548-294b-4acf-9c86-c036a0ce0fa4" containerID="314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7" exitCode=2 Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.713458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6b7vp" event={"ID":"0a149548-294b-4acf-9c86-c036a0ce0fa4","Type":"ContainerDied","Data":"314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.713497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6b7vp" event={"ID":"0a149548-294b-4acf-9c86-c036a0ce0fa4","Type":"ContainerDied","Data":"fa4a1d58753cbaee96412258304543c657b5ba86262da1c33b7e60f4ea0ad81d"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.713518 4795 scope.go:117] "RemoveContainer" containerID="314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.713789 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6b7vp" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.715437 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0b9f734-3994-4f64-92c4-86b71233f20a" (UID: "e0b9f734-3994-4f64-92c4-86b71233f20a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.725081 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" event={"ID":"5e854247-6adf-4f84-96b8-083f8772d8eb","Type":"ContainerStarted","Data":"0ce798f407f85cf5b4888b0e75235492564838ca965f82b8101727d9cc804fe7"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.725562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" event={"ID":"5e854247-6adf-4f84-96b8-083f8772d8eb","Type":"ContainerStarted","Data":"59595c9393216510f476d17334aad312c40f46a106921bf46f0a8b1ac96d4407"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.726649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.730505 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nz6vn container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.730897 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" podUID="5e854247-6adf-4f84-96b8-083f8772d8eb" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.735542 4795 scope.go:117] "RemoveContainer" containerID="d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.749012 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hx4v9_34467c19-ae33-49c1-871d-b2499252f0dd/extract-content/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.749824 4795 generic.go:334] "Generic (PLEG): container finished" podID="34467c19-ae33-49c1-871d-b2499252f0dd" containerID="0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf" exitCode=2 Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.750009 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hx4v9" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.755225 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cmh44_12857aab-68ed-47de-9796-c247fee349a3/extract-content/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.757062 4795 generic.go:334] "Generic (PLEG): container finished" podID="12857aab-68ed-47de-9796-c247fee349a3" containerID="af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd" exitCode=2 Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.757555 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cmh44" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.763423 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5v6t9_e0b9f734-3994-4f64-92c4-86b71233f20a/registry-server/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.765097 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" path="/var/lib/kubelet/pods/0849cac0-adb5-41b4-a67a-3f7dc195e78a/volumes" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.765487 4795 generic.go:334] "Generic (PLEG): container finished" podID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerID="c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245" exitCode=1 Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.765881 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5v6t9" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.770052 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" path="/var/lib/kubelet/pods/0e74a7c4-6d12-4252-9d93-e8950c9a7e46/volumes" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.775473 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" podStartSLOduration=2.775452316 podStartE2EDuration="2.775452316s" podCreationTimestamp="2025-12-05 08:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:28:10.746164276 +0000 UTC m=+242.318768015" watchObservedRunningTime="2025-12-05 08:28:10.775452316 +0000 UTC m=+242.348056055" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.777009 4795 scope.go:117] "RemoveContainer" 
containerID="314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.777729 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2s2bq_22f937a2-2821-4972-b16e-a266a8a3a837/registry-server/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: E1205 08:28:10.777698 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7\": container with ID starting with 314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7 not found: ID does not exist" containerID="314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.778110 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7"} err="failed to get container status \"314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7\": rpc error: code = NotFound desc = could not find container \"314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7\": container with ID starting with 314142ab84f16bd9a371fa7260a44280172d8836008834c45f9684d8689238a7 not found: ID does not exist" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.778230 4795 scope.go:117] "RemoveContainer" containerID="d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40" Dec 05 08:28:10 crc kubenswrapper[4795]: E1205 08:28:10.780104 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40\": container with ID starting with d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40 not found: ID does not exist" 
containerID="d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.780205 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40"} err="failed to get container status \"d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40\": rpc error: code = NotFound desc = could not find container \"d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40\": container with ID starting with d8405123f321705c49162686bde44089cd64faefdd44ed5c39b69e5f0ac07c40 not found: ID does not exist" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.781822 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2s2bq" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.799421 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7khpv_8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1/extract-content/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.803686 4795 generic.go:334] "Generic (PLEG): container finished" podID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" containerID="45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2" exitCode=2 Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.804994 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7khpv" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.810277 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820002 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls9hg\" (UniqueName: \"kubernetes.io/projected/2d3b856d-6654-4b3f-8b42-92c92e968f86-kube-api-access-ls9hg\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820016 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wkqb\" (UniqueName: \"kubernetes.io/projected/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-kube-api-access-8wkqb\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820027 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820036 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820046 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mmpv\" (UniqueName: \"kubernetes.io/projected/0a149548-294b-4acf-9c86-c036a0ce0fa4-kube-api-access-2mmpv\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820055 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a149548-294b-4acf-9c86-c036a0ce0fa4-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc 
kubenswrapper[4795]: I1205 08:28:10.820066 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86wdc\" (UniqueName: \"kubernetes.io/projected/12857aab-68ed-47de-9796-c247fee349a3-kube-api-access-86wdc\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820075 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820085 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp62b\" (UniqueName: \"kubernetes.io/projected/e0b9f734-3994-4f64-92c4-86b71233f20a-kube-api-access-mp62b\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820093 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820102 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0b9f734-3994-4f64-92c4-86b71233f20a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820110 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d3b856d-6654-4b3f-8b42-92c92e968f86-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.814086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx4v9" event={"ID":"34467c19-ae33-49c1-871d-b2499252f0dd","Type":"ContainerDied","Data":"0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820158 4795 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6b7vp"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hx4v9" event={"ID":"34467c19-ae33-49c1-871d-b2499252f0dd","Type":"ContainerDied","Data":"399874e26ae2dcaaa6316d6068037f74a2ce9ecbff96bb5c8c46fabca17bb902"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820192 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmh44" event={"ID":"12857aab-68ed-47de-9796-c247fee349a3","Type":"ContainerDied","Data":"af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cmh44" event={"ID":"12857aab-68ed-47de-9796-c247fee349a3","Type":"ContainerDied","Data":"926f965bcdd7a62547e3808682dafe23ffdd3a883e123d06d1ab089fb3d12858"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820217 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6b7vp"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820236 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v6t9" event={"ID":"e0b9f734-3994-4f64-92c4-86b71233f20a","Type":"ContainerDied","Data":"c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5v6t9" event={"ID":"e0b9f734-3994-4f64-92c4-86b71233f20a","Type":"ContainerDied","Data":"76209e12dbd8d75ea3792d8b13d7a83670afc93ae322649f21f858466a2e7d0b"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-2s2bq" event={"ID":"22f937a2-2821-4972-b16e-a266a8a3a837","Type":"ContainerDied","Data":"230f9391e9659d5627a33a2e4388626c108cbfef0f89d26d5ea04eb34081cb14"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820275 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7khpv" event={"ID":"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1","Type":"ContainerDied","Data":"45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820289 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7khpv" event={"ID":"8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1","Type":"ContainerDied","Data":"be2f2b9c09e9f723e152f397f1aeb53735b83880628a739f3bc7df4c7252fe11"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.820310 4795 scope.go:117] "RemoveContainer" containerID="0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.826485 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pqptl_2d3b856d-6654-4b3f-8b42-92c92e968f86/registry-server/0.log" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.829042 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerID="e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce" exitCode=1 Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.831755 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqptl" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.835264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqptl" event={"ID":"2d3b856d-6654-4b3f-8b42-92c92e968f86","Type":"ContainerDied","Data":"e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.835335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqptl" event={"ID":"2d3b856d-6654-4b3f-8b42-92c92e968f86","Type":"ContainerDied","Data":"927a540000ad56769f6d17fb9f0fc6388f498a02a87e03d9d11f3b1cbf4f3e44"} Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.854008 4795 scope.go:117] "RemoveContainer" containerID="c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.867715 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" (UID: "8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.886354 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hx4v9"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.894250 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hx4v9"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.898309 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v6t9"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.901529 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5v6t9"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.901635 4795 scope.go:117] "RemoveContainer" containerID="0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf" Dec 05 08:28:10 crc kubenswrapper[4795]: E1205 08:28:10.902123 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf\": container with ID starting with 0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf not found: ID does not exist" containerID="0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.902173 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf"} err="failed to get container status \"0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf\": rpc error: code = NotFound desc = could not find container \"0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf\": container with ID starting with 0856456d12d110c2ee6c98343ff5369908ccb45d31a352efee3a460885c570bf not found: ID does not exist" Dec 05 08:28:10 
crc kubenswrapper[4795]: I1205 08:28:10.902192 4795 scope.go:117] "RemoveContainer" containerID="c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.908765 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqptl"] Dec 05 08:28:10 crc kubenswrapper[4795]: E1205 08:28:10.909788 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373\": container with ID starting with c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373 not found: ID does not exist" containerID="c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.909846 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373"} err="failed to get container status \"c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373\": rpc error: code = NotFound desc = could not find container \"c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373\": container with ID starting with c2acbe457a35bd7240c0a804c2dc1212ab81d0aa2cfb1e37e4b555124646c373 not found: ID does not exist" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.909883 4795 scope.go:117] "RemoveContainer" containerID="af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.914109 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12857aab-68ed-47de-9796-c247fee349a3" (UID: "12857aab-68ed-47de-9796-c247fee349a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.918314 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqptl"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.922940 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2s2bq"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.924948 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12857aab-68ed-47de-9796-c247fee349a3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.924979 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.925764 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2s2bq"] Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.929807 4795 scope.go:117] "RemoveContainer" containerID="eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.962412 4795 scope.go:117] "RemoveContainer" containerID="af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd" Dec 05 08:28:10 crc kubenswrapper[4795]: E1205 08:28:10.963567 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd\": container with ID starting with af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd not found: ID does not exist" containerID="af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.963684 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd"} err="failed to get container status \"af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd\": rpc error: code = NotFound desc = could not find container \"af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd\": container with ID starting with af0d067b190808fa34c5d2c089bdd2d2b6335aedcbcdcf372d3dd35313ae5bbd not found: ID does not exist" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.963754 4795 scope.go:117] "RemoveContainer" containerID="eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0" Dec 05 08:28:10 crc kubenswrapper[4795]: E1205 08:28:10.966045 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0\": container with ID starting with eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0 not found: ID does not exist" containerID="eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.966141 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0"} err="failed to get container status \"eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0\": rpc error: code = NotFound desc = could not find container \"eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0\": container with ID starting with eeeb35520e4d311fa05d18eae6266ed92b1f76790d55db827893a96a679152e0 not found: ID does not exist" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 08:28:10.966220 4795 scope.go:117] "RemoveContainer" containerID="c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245" Dec 05 08:28:10 crc kubenswrapper[4795]: I1205 
08:28:10.992790 4795 scope.go:117] "RemoveContainer" containerID="6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.015497 4795 scope.go:117] "RemoveContainer" containerID="11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.038501 4795 scope.go:117] "RemoveContainer" containerID="c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.039176 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245\": container with ID starting with c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245 not found: ID does not exist" containerID="c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.039212 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245"} err="failed to get container status \"c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245\": rpc error: code = NotFound desc = could not find container \"c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245\": container with ID starting with c144bbbfe16ca958ef8500511a658c0153aca0a6cd8c5db475f666bc2e063245 not found: ID does not exist" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.039242 4795 scope.go:117] "RemoveContainer" containerID="6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.039919 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe\": container 
with ID starting with 6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe not found: ID does not exist" containerID="6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.039945 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe"} err="failed to get container status \"6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe\": rpc error: code = NotFound desc = could not find container \"6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe\": container with ID starting with 6fa3a365103a69f19cf52728ba00ec5d5c1518a8740e7e6007e819bcccfc6cfe not found: ID does not exist" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.039960 4795 scope.go:117] "RemoveContainer" containerID="11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.041399 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3\": container with ID starting with 11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3 not found: ID does not exist" containerID="11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.041420 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3"} err="failed to get container status \"11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3\": rpc error: code = NotFound desc = could not find container \"11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3\": container with ID starting with 11e458571d545bc7bc0d9a0d7d21869a1fb8874ae9ca8117ff063c953f3de1f3 not 
found: ID does not exist" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.041434 4795 scope.go:117] "RemoveContainer" containerID="7cdbf6a797cb12d6ddaf8dcc3c7544ba911c5629dc0abb41342c124db4b7846a" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.058127 4795 scope.go:117] "RemoveContainer" containerID="33fb7f0f7da29f49c9d899994d484528aa10df41ec5916c289fc753a6dc8b848" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.085464 4795 scope.go:117] "RemoveContainer" containerID="09be6c20c5d35d94c82939750df346feb2ef4a9ba070b26f647784ddfa41f3ec" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.102996 4795 scope.go:117] "RemoveContainer" containerID="45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.122764 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cmh44"] Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.124226 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cmh44"] Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.142418 4795 scope.go:117] "RemoveContainer" containerID="9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.169919 4795 scope.go:117] "RemoveContainer" containerID="45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.170510 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2\": container with ID starting with 45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2 not found: ID does not exist" containerID="45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.170579 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2"} err="failed to get container status \"45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2\": rpc error: code = NotFound desc = could not find container \"45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2\": container with ID starting with 45ab28aea9cad2a4394cdda612215c11b5dac512b02c3785d96c93f367eca2f2 not found: ID does not exist" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.170605 4795 scope.go:117] "RemoveContainer" containerID="9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.171009 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52\": container with ID starting with 9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52 not found: ID does not exist" containerID="9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.171036 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52"} err="failed to get container status \"9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52\": rpc error: code = NotFound desc = could not find container \"9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52\": container with ID starting with 9ae96083e56dd9bdfc6e9e9817485be0eb83946b794f6a0bdd27d16fe66a9d52 not found: ID does not exist" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.171054 4795 scope.go:117] "RemoveContainer" containerID="e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 
08:28:11.195786 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7khpv"] Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.203876 4795 scope.go:117] "RemoveContainer" containerID="a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.208157 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7khpv"] Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.233074 4795 scope.go:117] "RemoveContainer" containerID="b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.252597 4795 scope.go:117] "RemoveContainer" containerID="e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.253847 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce\": container with ID starting with e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce not found: ID does not exist" containerID="e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.253927 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce"} err="failed to get container status \"e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce\": rpc error: code = NotFound desc = could not find container \"e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce\": container with ID starting with e194438e97ac875fdb99b545b7fe73263fa7e27089243deb3c6f2869d1a737ce not found: ID does not exist" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.253979 4795 scope.go:117] "RemoveContainer" 
containerID="a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.254540 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9\": container with ID starting with a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9 not found: ID does not exist" containerID="a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.254567 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9"} err="failed to get container status \"a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9\": rpc error: code = NotFound desc = could not find container \"a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9\": container with ID starting with a33a57d17ae63bf8486661fd2a561a9da642f69e9420f6b137636346f2c398a9 not found: ID does not exist" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.254583 4795 scope.go:117] "RemoveContainer" containerID="b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.254921 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884\": container with ID starting with b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884 not found: ID does not exist" containerID="b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.255029 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884"} err="failed to get container status \"b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884\": rpc error: code = NotFound desc = could not find container \"b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884\": container with ID starting with b96d0b53b84a0c74e5f308d1a7758825ab10343805c4c7b848b36b8eaa414884 not found: ID does not exist" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.412506 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zrx88"] Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.412874 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.412903 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.412917 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.412927 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.412938 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.412947 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.412968 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="34467c19-ae33-49c1-871d-b2499252f0dd" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.412977 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="34467c19-ae33-49c1-871d-b2499252f0dd" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.412990 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413000 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413010 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413019 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413029 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413047 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12857aab-68ed-47de-9796-c247fee349a3" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413054 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12857aab-68ed-47de-9796-c247fee349a3" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413066 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413073 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413082 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12857aab-68ed-47de-9796-c247fee349a3" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413090 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12857aab-68ed-47de-9796-c247fee349a3" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413101 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413111 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413121 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413129 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413139 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413147 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413160 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413168 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413183 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413192 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413205 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413213 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413224 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413293 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413306 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413314 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.413326 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.413336 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerName="extract-utilities" Dec 05 08:28:11 crc kubenswrapper[4795]: E1205 08:28:11.415535 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34467c19-ae33-49c1-871d-b2499252f0dd" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415581 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="34467c19-ae33-49c1-871d-b2499252f0dd" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415772 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0849cac0-adb5-41b4-a67a-3f7dc195e78a" containerName="marketplace-operator" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415822 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415836 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e74a7c4-6d12-4252-9d93-e8950c9a7e46" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415846 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="34467c19-ae33-49c1-871d-b2499252f0dd" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415856 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="12857aab-68ed-47de-9796-c247fee349a3" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415894 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415907 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415918 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" containerName="extract-content" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.415934 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" containerName="registry-server" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.426666 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrx88"] Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.426827 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.432066 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.546938 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4r2\" (UniqueName: \"kubernetes.io/projected/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-kube-api-access-km4r2\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.547018 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-catalog-content\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.547044 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-utilities\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.648891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-catalog-content\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.649279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-utilities\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.649446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4r2\" (UniqueName: \"kubernetes.io/projected/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-kube-api-access-km4r2\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.650350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-catalog-content\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.650443 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-utilities\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.677685 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4r2\" (UniqueName: \"kubernetes.io/projected/56e94ad9-4c99-4fa8-bb1e-540fadd9410c-kube-api-access-km4r2\") pod \"redhat-marketplace-zrx88\" (UID: \"56e94ad9-4c99-4fa8-bb1e-540fadd9410c\") " pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.745310 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.852960 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nz6vn" Dec 05 08:28:11 crc kubenswrapper[4795]: I1205 08:28:11.987253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrx88"] Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.755239 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a149548-294b-4acf-9c86-c036a0ce0fa4" path="/var/lib/kubelet/pods/0a149548-294b-4acf-9c86-c036a0ce0fa4/volumes" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.758087 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12857aab-68ed-47de-9796-c247fee349a3" path="/var/lib/kubelet/pods/12857aab-68ed-47de-9796-c247fee349a3/volumes" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.758632 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f937a2-2821-4972-b16e-a266a8a3a837" 
path="/var/lib/kubelet/pods/22f937a2-2821-4972-b16e-a266a8a3a837/volumes" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.759406 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d3b856d-6654-4b3f-8b42-92c92e968f86" path="/var/lib/kubelet/pods/2d3b856d-6654-4b3f-8b42-92c92e968f86/volumes" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.761516 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34467c19-ae33-49c1-871d-b2499252f0dd" path="/var/lib/kubelet/pods/34467c19-ae33-49c1-871d-b2499252f0dd/volumes" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.762164 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1" path="/var/lib/kubelet/pods/8a0cea5b-c8ea-49d2-bd1e-8e8312f454e1/volumes" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.762760 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b9f734-3994-4f64-92c4-86b71233f20a" path="/var/lib/kubelet/pods/e0b9f734-3994-4f64-92c4-86b71233f20a/volumes" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.810002 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cccsq"] Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.811320 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.815262 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.856277 4795 generic.go:334] "Generic (PLEG): container finished" podID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" containerID="4e20f98ec18d05c4c7312cda35c980c6c1c189231cccab52ebd34443db448287" exitCode=0 Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.856419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrx88" event={"ID":"56e94ad9-4c99-4fa8-bb1e-540fadd9410c","Type":"ContainerDied","Data":"4e20f98ec18d05c4c7312cda35c980c6c1c189231cccab52ebd34443db448287"} Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.856472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrx88" event={"ID":"56e94ad9-4c99-4fa8-bb1e-540fadd9410c","Type":"ContainerStarted","Data":"a73c0fc920c82a87f285f484c98ba8d81dd25a6b1bd8416541ae9e5cb16163e2"} Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.865373 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cccsq"] Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.969751 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-catalog-content\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.969807 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-utilities\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:12 crc kubenswrapper[4795]: I1205 08:28:12.969828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbtbx\" (UniqueName: \"kubernetes.io/projected/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-kube-api-access-hbtbx\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.071433 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-catalog-content\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.071508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-utilities\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.071533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbtbx\" (UniqueName: \"kubernetes.io/projected/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-kube-api-access-hbtbx\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.072470 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-catalog-content\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.072722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-utilities\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.093867 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbtbx\" (UniqueName: \"kubernetes.io/projected/c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e-kube-api-access-hbtbx\") pod \"redhat-operators-cccsq\" (UID: \"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e\") " pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.128667 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.340071 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cccsq"] Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.808214 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ksjsh"] Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.810501 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.815559 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.825510 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ksjsh"] Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.865343 4795 generic.go:334] "Generic (PLEG): container finished" podID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" containerID="3e83d509a8a668628936990b0a6bcbf20e3beb94eeb1cbef9a551a565d452621" exitCode=0 Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.865452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cccsq" event={"ID":"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e","Type":"ContainerDied","Data":"3e83d509a8a668628936990b0a6bcbf20e3beb94eeb1cbef9a551a565d452621"} Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.866031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cccsq" event={"ID":"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e","Type":"ContainerStarted","Data":"afadaf844fc68ac7f4001b25961ca00b27f195f73159d49a9b35cd386ae6392e"} Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.893916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-utilities\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.894321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm7ws\" (UniqueName: 
\"kubernetes.io/projected/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-kube-api-access-tm7ws\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.894488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-catalog-content\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.995828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-catalog-content\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.995954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-utilities\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.995999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm7ws\" (UniqueName: \"kubernetes.io/projected/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-kube-api-access-tm7ws\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.996640 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-catalog-content\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:13 crc kubenswrapper[4795]: I1205 08:28:13.997368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-utilities\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.021082 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm7ws\" (UniqueName: \"kubernetes.io/projected/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-kube-api-access-tm7ws\") pod \"certified-operators-ksjsh\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.132328 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.324073 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ksjsh"] Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.874094 4795 generic.go:334] "Generic (PLEG): container finished" podID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" containerID="2848fa587dff163f9d0f4fd17ba079b47164e02ac1a5957e92d2870e5a03b992" exitCode=0 Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.875594 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrx88" event={"ID":"56e94ad9-4c99-4fa8-bb1e-540fadd9410c","Type":"ContainerDied","Data":"2848fa587dff163f9d0f4fd17ba079b47164e02ac1a5957e92d2870e5a03b992"} Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.881086 4795 generic.go:334] "Generic (PLEG): container finished" podID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerID="ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308" exitCode=0 Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.881157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksjsh" event={"ID":"12d2c3d5-9182-44a8-9ad9-26b54dc3135b","Type":"ContainerDied","Data":"ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308"} Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.881190 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksjsh" event={"ID":"12d2c3d5-9182-44a8-9ad9-26b54dc3135b","Type":"ContainerStarted","Data":"72df7d1181a9ac84a8cd371fbedf6e0c588b9f4da746b3238891709348df8e45"} Dec 05 08:28:14 crc kubenswrapper[4795]: I1205 08:28:14.887447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cccsq" 
event={"ID":"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e","Type":"ContainerStarted","Data":"ba1aa6c847bd65fde1d6286e41b90f4e014e6ceba0e0d6b31926d655cbcad0e4"} Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.207516 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9b5qq"] Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.209229 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.212690 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.226083 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9b5qq"] Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.315312 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnddc\" (UniqueName: \"kubernetes.io/projected/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-kube-api-access-qnddc\") pod \"community-operators-9b5qq\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.315372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-catalog-content\") pod \"community-operators-9b5qq\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.315419 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-utilities\") pod 
\"community-operators-9b5qq\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.416673 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnddc\" (UniqueName: \"kubernetes.io/projected/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-kube-api-access-qnddc\") pod \"community-operators-9b5qq\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.417111 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-catalog-content\") pod \"community-operators-9b5qq\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.417268 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-utilities\") pod \"community-operators-9b5qq\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.417630 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-catalog-content\") pod \"community-operators-9b5qq\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.417748 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-utilities\") pod \"community-operators-9b5qq\" (UID: 
\"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.441108 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnddc\" (UniqueName: \"kubernetes.io/projected/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-kube-api-access-qnddc\") pod \"community-operators-9b5qq\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") " pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.535398 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.894446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksjsh" event={"ID":"12d2c3d5-9182-44a8-9ad9-26b54dc3135b","Type":"ContainerStarted","Data":"c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351"} Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.898273 4795 generic.go:334] "Generic (PLEG): container finished" podID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" containerID="ba1aa6c847bd65fde1d6286e41b90f4e014e6ceba0e0d6b31926d655cbcad0e4" exitCode=0 Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.898325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cccsq" event={"ID":"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e","Type":"ContainerDied","Data":"ba1aa6c847bd65fde1d6286e41b90f4e014e6ceba0e0d6b31926d655cbcad0e4"} Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.902498 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrx88" event={"ID":"56e94ad9-4c99-4fa8-bb1e-540fadd9410c","Type":"ContainerStarted","Data":"f5b14ee0ef806449fe6f55f0317ba800ddf8c3df302f998940dd5d306f543d50"} Dec 05 08:28:15 crc kubenswrapper[4795]: I1205 08:28:15.963872 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zrx88" podStartSLOduration=2.322980727 podStartE2EDuration="4.963843212s" podCreationTimestamp="2025-12-05 08:28:11 +0000 UTC" firstStartedPulling="2025-12-05 08:28:12.857990265 +0000 UTC m=+244.430594024" lastFinishedPulling="2025-12-05 08:28:15.49885277 +0000 UTC m=+247.071456509" observedRunningTime="2025-12-05 08:28:15.962208213 +0000 UTC m=+247.534811962" watchObservedRunningTime="2025-12-05 08:28:15.963843212 +0000 UTC m=+247.536446951" Dec 05 08:28:16 crc kubenswrapper[4795]: I1205 08:28:16.007305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9b5qq"] Dec 05 08:28:16 crc kubenswrapper[4795]: I1205 08:28:16.910951 4795 generic.go:334] "Generic (PLEG): container finished" podID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerID="c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351" exitCode=0 Dec 05 08:28:16 crc kubenswrapper[4795]: I1205 08:28:16.911073 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksjsh" event={"ID":"12d2c3d5-9182-44a8-9ad9-26b54dc3135b","Type":"ContainerDied","Data":"c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351"} Dec 05 08:28:16 crc kubenswrapper[4795]: I1205 08:28:16.916573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cccsq" event={"ID":"c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e","Type":"ContainerStarted","Data":"78cf1f58fbbf7070578362a1efb54f632f849fdbfbba383f03b8b1204d84b746"} Dec 05 08:28:16 crc kubenswrapper[4795]: I1205 08:28:16.919189 4795 generic.go:334] "Generic (PLEG): container finished" podID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerID="e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc" exitCode=0 Dec 05 08:28:16 crc kubenswrapper[4795]: I1205 08:28:16.920124 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-9b5qq" event={"ID":"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d","Type":"ContainerDied","Data":"e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc"} Dec 05 08:28:16 crc kubenswrapper[4795]: I1205 08:28:16.920280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9b5qq" event={"ID":"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d","Type":"ContainerStarted","Data":"51fd62308d8d4eda05cfa629efa29824f1409d6ea1908e75b3f34a5f0ba2d0c7"} Dec 05 08:28:16 crc kubenswrapper[4795]: I1205 08:28:16.987823 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cccsq" podStartSLOduration=2.558953351 podStartE2EDuration="4.987796087s" podCreationTimestamp="2025-12-05 08:28:12 +0000 UTC" firstStartedPulling="2025-12-05 08:28:13.867296466 +0000 UTC m=+245.439900205" lastFinishedPulling="2025-12-05 08:28:16.296139202 +0000 UTC m=+247.868742941" observedRunningTime="2025-12-05 08:28:16.985436606 +0000 UTC m=+248.558040345" watchObservedRunningTime="2025-12-05 08:28:16.987796087 +0000 UTC m=+248.560399816" Dec 05 08:28:17 crc kubenswrapper[4795]: I1205 08:28:17.928368 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9b5qq" event={"ID":"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d","Type":"ContainerStarted","Data":"7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf"} Dec 05 08:28:17 crc kubenswrapper[4795]: I1205 08:28:17.930881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksjsh" event={"ID":"12d2c3d5-9182-44a8-9ad9-26b54dc3135b","Type":"ContainerStarted","Data":"469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03"} Dec 05 08:28:18 crc kubenswrapper[4795]: I1205 08:28:18.939348 4795 generic.go:334] "Generic (PLEG): container finished" podID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" 
containerID="7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf" exitCode=0 Dec 05 08:28:18 crc kubenswrapper[4795]: I1205 08:28:18.939456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9b5qq" event={"ID":"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d","Type":"ContainerDied","Data":"7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf"} Dec 05 08:28:18 crc kubenswrapper[4795]: I1205 08:28:18.966837 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ksjsh" podStartSLOduration=3.414266542 podStartE2EDuration="5.966812042s" podCreationTimestamp="2025-12-05 08:28:13 +0000 UTC" firstStartedPulling="2025-12-05 08:28:14.882716298 +0000 UTC m=+246.455320037" lastFinishedPulling="2025-12-05 08:28:17.435261798 +0000 UTC m=+249.007865537" observedRunningTime="2025-12-05 08:28:17.970688473 +0000 UTC m=+249.543292222" watchObservedRunningTime="2025-12-05 08:28:18.966812042 +0000 UTC m=+250.539415781" Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.746312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.747061 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.838675 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.968004 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9b5qq" event={"ID":"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d","Type":"ContainerStarted","Data":"bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16"} Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.997989 
4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.998874 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.999040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.999347 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5" gracePeriod=15 Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.999430 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08" gracePeriod=15 Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.999510 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5" gracePeriod=15 Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.999565 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2" 
gracePeriod=15 Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.999404 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3" gracePeriod=15 Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.999756 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 08:28:21 crc kubenswrapper[4795]: E1205 08:28:21.999884 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 08:28:21 crc kubenswrapper[4795]: I1205 08:28:21.999904 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:21.999914 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:21.999922 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:21.999936 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:21.999943 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:21.999950 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 08:28:22 crc 
kubenswrapper[4795]: I1205 08:28:21.999956 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:21.999963 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:21.999970 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:21.999982 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:21.999989 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:21.999997 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.000004 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.000095 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.000107 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.000116 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.000128 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.000138 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.000336 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.100852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zrx88" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.103865 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.104252 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.122044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.122230 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.122342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.122440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.122500 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.122531 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.122586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.122674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:22.139204 4795 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224390 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224509 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224536 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224564 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224535 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224876 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.224934 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.441142 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:22.481351 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e4464842aa201 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 08:28:22.480617985 +0000 UTC m=+254.053221714,LastTimestamp:2025-12-05 08:28:22.480617985 +0000 UTC m=+254.053221714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.975266 4795 generic.go:334] "Generic (PLEG): container finished" podID="1b992d50-5e09-444d-813a-a2c4cfa25e05" containerID="5a47b9e6bf24fb7d269c30cde35c4b7a723dc17335fab4448f2425ada5b778d3" exitCode=0 Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.975353 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1b992d50-5e09-444d-813a-a2c4cfa25e05","Type":"ContainerDied","Data":"5a47b9e6bf24fb7d269c30cde35c4b7a723dc17335fab4448f2425ada5b778d3"} Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.976207 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.976900 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.978296 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.979707 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.980511 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08" exitCode=0 Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.980541 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3" exitCode=0 Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.980551 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5" exitCode=0 Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.980559 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2" exitCode=2 Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.980590 4795 scope.go:117] "RemoveContainer" containerID="99edc53e936059d7c74a5ff476422575db855db0309a5d605b77b9fea6c3c827" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.982443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c5b924ec64bd9e684bb83afccd57c8c5a3473744eede3510a9aad5bfe735fb14"} Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.982495 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bc38509817ed4e5a0ea69053c8d9994b7ee52f1850643ec321d6f4ce89fac25e"} Dec 05 08:28:22 crc kubenswrapper[4795]: E1205 08:28:22.983934 4795 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.984207 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:22 crc kubenswrapper[4795]: I1205 08:28:22.984377 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:23 crc kubenswrapper[4795]: I1205 08:28:23.129578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:23 crc kubenswrapper[4795]: I1205 08:28:23.129683 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:23 crc kubenswrapper[4795]: I1205 08:28:23.177457 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:23 crc kubenswrapper[4795]: I1205 08:28:23.178094 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:23 crc kubenswrapper[4795]: I1205 08:28:23.178332 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:23 crc kubenswrapper[4795]: I1205 08:28:23.178534 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:23 crc kubenswrapper[4795]: I1205 08:28:23.993252 4795 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.041976 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cccsq" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.042800 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.043082 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.043301 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.132985 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.133393 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.236853 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.237541 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.237998 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.238368 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.238861 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.451784 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.452788 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.453928 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.454569 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.454957 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.560479 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-kubelet-dir\") pod \"1b992d50-5e09-444d-813a-a2c4cfa25e05\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " Dec 05 
08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.560582 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b992d50-5e09-444d-813a-a2c4cfa25e05-kube-api-access\") pod \"1b992d50-5e09-444d-813a-a2c4cfa25e05\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.560620 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-var-lock\") pod \"1b992d50-5e09-444d-813a-a2c4cfa25e05\" (UID: \"1b992d50-5e09-444d-813a-a2c4cfa25e05\") " Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.560738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1b992d50-5e09-444d-813a-a2c4cfa25e05" (UID: "1b992d50-5e09-444d-813a-a2c4cfa25e05"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.561155 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.561221 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-var-lock" (OuterVolumeSpecName: "var-lock") pod "1b992d50-5e09-444d-813a-a2c4cfa25e05" (UID: "1b992d50-5e09-444d-813a-a2c4cfa25e05"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.572317 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b992d50-5e09-444d-813a-a2c4cfa25e05-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1b992d50-5e09-444d-813a-a2c4cfa25e05" (UID: "1b992d50-5e09-444d-813a-a2c4cfa25e05"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.662394 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1b992d50-5e09-444d-813a-a2c4cfa25e05-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:24 crc kubenswrapper[4795]: I1205 08:28:24.662431 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1b992d50-5e09-444d-813a-a2c4cfa25e05-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.004053 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1b992d50-5e09-444d-813a-a2c4cfa25e05","Type":"ContainerDied","Data":"54293734f2edd823448479706f97219b52f9ee77663087899f57fbdadb05f150"} Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.006374 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54293734f2edd823448479706f97219b52f9ee77663087899f57fbdadb05f150" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.004130 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.011381 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.011943 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.013296 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.013727 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.052380 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.052942 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.053289 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.053650 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.053874 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: E1205 08:28:25.490752 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e4464842aa201 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-05 08:28:22.480617985 +0000 UTC m=+254.053221714,LastTimestamp:2025-12-05 08:28:22.480617985 +0000 UTC m=+254.053221714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.536264 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.536323 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.584695 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.585237 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.585568 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.586631 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.586982 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:25 crc kubenswrapper[4795]: I1205 08:28:25.587267 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:26 crc kubenswrapper[4795]: I1205 08:28:26.062270 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9b5qq" Dec 05 08:28:26 crc kubenswrapper[4795]: I1205 08:28:26.063262 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:26 crc kubenswrapper[4795]: I1205 08:28:26.063794 4795 
status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:26 crc kubenswrapper[4795]: I1205 08:28:26.064309 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:26 crc kubenswrapper[4795]: I1205 08:28:26.064576 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:26 crc kubenswrapper[4795]: I1205 08:28:26.064961 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.022485 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.023916 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5" exitCode=0 
Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.297565 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.299600 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.300227 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.300550 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.300907 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.301234 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection 
refused" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.301479 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.301935 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.407256 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.407300 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.407393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.407744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" 
(OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.407782 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.407834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.509434 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.509478 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:27 crc kubenswrapper[4795]: I1205 08:28:27.509488 4795 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.033369 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.034454 4795 scope.go:117] "RemoveContainer" containerID="ed28a35eea9d4cd3dde1ccd56791a3e0cbbf98a73bfcbcd855eb13725fe65e08" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.034601 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.051062 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.051422 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.053347 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.056781 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.059924 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.060367 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.062203 4795 scope.go:117] "RemoveContainer" containerID="a3d8ff8a0d0ee895980fa3b6fd1c57037289aa3997bacfbbb86bd97323b9f9c3" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.099967 4795 scope.go:117] "RemoveContainer" containerID="114c8b61794b1a07e399dc632e55da9fa56bb2d5667be7b7c319c168436ba5b5" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.116470 4795 scope.go:117] "RemoveContainer" containerID="7972384212cedbc9f1664d1efeadfcc1438857251dfb898f6efd8a34cda4a9a2" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.130786 4795 scope.go:117] "RemoveContainer" containerID="5dd491b9904d3f2868e399af5e19fa0ae321118a79fdda0cf5655bc7cb99efe5" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.148875 4795 scope.go:117] "RemoveContainer" containerID="3f0328a9f08fdad15cb92265e17f2931e935196f70feb4f0318fb67b941f2f44" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.749664 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.750355 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.750686 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.751057 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.751402 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.751828 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:28 crc kubenswrapper[4795]: I1205 08:28:28.755729 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 05 08:28:29 crc kubenswrapper[4795]: I1205 08:28:29.269303 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" Dec 05 08:28:29 crc kubenswrapper[4795]: I1205 08:28:29.269998 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:29 crc kubenswrapper[4795]: I1205 08:28:29.270530 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:29 crc kubenswrapper[4795]: I1205 08:28:29.270948 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:29 crc kubenswrapper[4795]: I1205 08:28:29.271238 4795 status_manager.go:851] 
"Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:29 crc kubenswrapper[4795]: I1205 08:28:29.271543 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:29 crc kubenswrapper[4795]: I1205 08:28:29.272215 4795 status_manager.go:851] "Failed to get status for pod" podUID="91b331a6-6a6f-443d-a101-86e642f45659" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-sb6fd\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:29 crc kubenswrapper[4795]: E1205 08:28:29.372178 4795 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" volumeName="registry-storage" Dec 05 08:28:30 crc kubenswrapper[4795]: E1205 08:28:30.792658 4795 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" volumeName="registry-storage" Dec 05 08:28:31 crc kubenswrapper[4795]: E1205 08:28:31.914546 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:31 crc kubenswrapper[4795]: E1205 08:28:31.916076 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:31 crc kubenswrapper[4795]: E1205 08:28:31.916679 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:31 crc kubenswrapper[4795]: E1205 08:28:31.917330 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:31 crc kubenswrapper[4795]: E1205 08:28:31.918003 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:31 crc kubenswrapper[4795]: I1205 08:28:31.918054 4795 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 05 08:28:31 crc kubenswrapper[4795]: 
E1205 08:28:31.918534 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="200ms" Dec 05 08:28:31 crc kubenswrapper[4795]: I1205 08:28:31.921699 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" podUID="41bb386f-8261-4203-a385-f2918e5f9718" containerName="oauth-openshift" containerID="cri-o://ccfd521165f6178751f2eab9fc0e316567ab3fa16e3dec56a96962d2d68d5a26" gracePeriod=15 Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.072250 4795 generic.go:334] "Generic (PLEG): container finished" podID="41bb386f-8261-4203-a385-f2918e5f9718" containerID="ccfd521165f6178751f2eab9fc0e316567ab3fa16e3dec56a96962d2d68d5a26" exitCode=0 Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.072547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" event={"ID":"41bb386f-8261-4203-a385-f2918e5f9718","Type":"ContainerDied","Data":"ccfd521165f6178751f2eab9fc0e316567ab3fa16e3dec56a96962d2d68d5a26"} Dec 05 08:28:32 crc kubenswrapper[4795]: E1205 08:28:32.119863 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="400ms" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.354263 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.354875 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.355436 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.356052 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.356393 4795 status_manager.go:851] "Failed to get status for pod" podUID="41bb386f-8261-4203-a385-f2918e5f9718" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tn798\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.356909 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.357335 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.357698 4795 status_manager.go:851] "Failed to get status for pod" podUID="91b331a6-6a6f-443d-a101-86e642f45659" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-sb6fd\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.487075 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-service-ca\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.487552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-router-certs\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.487702 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-idp-0-file-data\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.489472 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-error\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.489691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-ocp-branding-template\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.490234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-trusted-ca-bundle\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.490407 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-login\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.490706 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/41bb386f-8261-4203-a385-f2918e5f9718-audit-dir\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.490860 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-cliconfig\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.491010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frdzs\" (UniqueName: \"kubernetes.io/projected/41bb386f-8261-4203-a385-f2918e5f9718-kube-api-access-frdzs\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.491173 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-session\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.491310 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-provider-selection\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.491450 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-serving-cert\") 
pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.492305 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-audit-policies\") pod \"41bb386f-8261-4203-a385-f2918e5f9718\" (UID: \"41bb386f-8261-4203-a385-f2918e5f9718\") " Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.490073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.491102 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.491141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41bb386f-8261-4203-a385-f2918e5f9718-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.491731 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.493286 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.496199 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.497575 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.501004 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.501391 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41bb386f-8261-4203-a385-f2918e5f9718-kube-api-access-frdzs" (OuterVolumeSpecName: "kube-api-access-frdzs") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "kube-api-access-frdzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.501615 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.501800 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.502372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.503033 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.504252 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "41bb386f-8261-4203-a385-f2918e5f9718" (UID: "41bb386f-8261-4203-a385-f2918e5f9718"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:28:32 crc kubenswrapper[4795]: E1205 08:28:32.521026 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="800ms" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.595765 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.596284 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.596473 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.596724 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.597016 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.597179 4795 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.597304 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/41bb386f-8261-4203-a385-f2918e5f9718-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.597443 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.597584 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frdzs\" (UniqueName: \"kubernetes.io/projected/41bb386f-8261-4203-a385-f2918e5f9718-kube-api-access-frdzs\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.597749 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.597891 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.597990 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc 
kubenswrapper[4795]: I1205 08:28:32.598083 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.598165 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/41bb386f-8261-4203-a385-f2918e5f9718-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.747559 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.748569 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.749090 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.750106 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.751040 4795 
status_manager.go:851] "Failed to get status for pod" podUID="41bb386f-8261-4203-a385-f2918e5f9718" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tn798\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.751705 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.751917 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.752833 4795 status_manager.go:851] "Failed to get status for pod" podUID="91b331a6-6a6f-443d-a101-86e642f45659" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-sb6fd\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.764159 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="42252db7-6e43-427a-9257-3516071eb545" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.764219 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="42252db7-6e43-427a-9257-3516071eb545" Dec 05 08:28:32 crc kubenswrapper[4795]: 
E1205 08:28:32.765581 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:32 crc kubenswrapper[4795]: I1205 08:28:32.766332 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.082876 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.084242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" event={"ID":"41bb386f-8261-4203-a385-f2918e5f9718","Type":"ContainerDied","Data":"a2a9e8b1be6b75caf9bdfdfd68b277b86d8aff01dbd4d665dcd183ab8bc2b596"} Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.084362 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.084378 4795 scope.go:117] "RemoveContainer" containerID="ccfd521165f6178751f2eab9fc0e316567ab3fa16e3dec56a96962d2d68d5a26" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.085124 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection 
refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.086506 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.086899 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.087245 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.087650 4795 status_manager.go:851] "Failed to get status for pod" podUID="41bb386f-8261-4203-a385-f2918e5f9718" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tn798\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.088166 4795 status_manager.go:851] "Failed to get status for pod" podUID="91b331a6-6a6f-443d-a101-86e642f45659" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-sb6fd\": dial tcp 38.102.83.222:6443: connect: 
connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.088261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"79170c74b5586ce3a470d3ae3db9f2c0b74d25d9dd15788ffc592ce906864112"} Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.089265 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.089878 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.090090 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.090279 4795 status_manager.go:851] "Failed to get status for pod" podUID="41bb386f-8261-4203-a385-f2918e5f9718" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tn798\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.090453 4795 
status_manager.go:851] "Failed to get status for pod" podUID="91b331a6-6a6f-443d-a101-86e642f45659" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-sb6fd\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.090634 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: I1205 08:28:33.090799 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:33 crc kubenswrapper[4795]: E1205 08:28:33.322501 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="1.6s" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.097003 4795 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3f63971a224d6670aae10f4aec953e0ceb256847804dfa35ff75e48688a67f8e" exitCode=0 Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.097114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3f63971a224d6670aae10f4aec953e0ceb256847804dfa35ff75e48688a67f8e"} Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.097662 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="42252db7-6e43-427a-9257-3516071eb545" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.097709 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="42252db7-6e43-427a-9257-3516071eb545" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.098087 4795 status_manager.go:851] "Failed to get status for pod" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" pod="openshift-marketplace/community-operators-9b5qq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-9b5qq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:34 crc kubenswrapper[4795]: E1205 08:28:34.098413 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.098872 4795 status_manager.go:851] "Failed to get status for pod" podUID="c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e" pod="openshift-marketplace/redhat-operators-cccsq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cccsq\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.099172 4795 status_manager.go:851] "Failed to get status for pod" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" pod="openshift-marketplace/certified-operators-ksjsh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-ksjsh\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.099432 4795 status_manager.go:851] "Failed to get status for pod" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.099685 4795 status_manager.go:851] "Failed to get status for pod" podUID="56e94ad9-4c99-4fa8-bb1e-540fadd9410c" pod="openshift-marketplace/redhat-marketplace-zrx88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-zrx88\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.100294 4795 status_manager.go:851] "Failed to get status for pod" podUID="41bb386f-8261-4203-a385-f2918e5f9718" pod="openshift-authentication/oauth-openshift-558db77b4-tn798" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-tn798\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:34 crc kubenswrapper[4795]: I1205 08:28:34.101015 4795 status_manager.go:851] "Failed to get status for pod" podUID="91b331a6-6a6f-443d-a101-86e642f45659" pod="openshift-image-registry/image-registry-66df7c8f76-sb6fd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-registry-66df7c8f76-sb6fd\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 05 08:28:35 crc kubenswrapper[4795]: I1205 08:28:35.107347 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2abb63609d5386fb3cb9c086ecd6dd288ae9c12394f0916948e0f9b29a2a9b24"} Dec 05 08:28:35 crc kubenswrapper[4795]: I1205 08:28:35.107919 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3d637b1b6c48b91bbcc5b7bba31a34ceac9ebee739fd23a1b09e8e0765d01576"} Dec 05 08:28:35 crc kubenswrapper[4795]: I1205 08:28:35.107936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f528b365a0a5f4164e3fda8b12f6f2b19b03714aac3d18a82dc0bb05e6a8a97"} Dec 05 08:28:35 crc kubenswrapper[4795]: I1205 08:28:35.107946 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"36c255362b4b65b2e7451dcfafb1d1e2224ab2cc3033769a700b63c646fa7acf"} Dec 05 08:28:36 crc kubenswrapper[4795]: I1205 08:28:36.133120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5edb7c1f238d00d3a34fc22a774b6bdadebbf3f62d3d01b1c7f57c2bd14310b5"} Dec 05 08:28:36 crc kubenswrapper[4795]: I1205 08:28:36.133560 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="42252db7-6e43-427a-9257-3516071eb545" Dec 05 08:28:36 crc kubenswrapper[4795]: I1205 08:28:36.133604 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="42252db7-6e43-427a-9257-3516071eb545" Dec 05 08:28:36 crc kubenswrapper[4795]: I1205 08:28:36.134142 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:37 crc kubenswrapper[4795]: I1205 08:28:37.766586 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:37 crc kubenswrapper[4795]: I1205 08:28:37.766988 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:37 crc kubenswrapper[4795]: I1205 08:28:37.771275 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 08:28:37 crc kubenswrapper[4795]: I1205 08:28:37.771353 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 08:28:37 crc kubenswrapper[4795]: I1205 08:28:37.772392 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]log ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]etcd ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 05 08:28:37 
crc kubenswrapper[4795]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/generic-apiserver-start-informers ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/priority-and-fairness-filter ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/start-apiextensions-informers ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/start-apiextensions-controllers ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/crd-informer-synced ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/start-system-namespaces-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 05 08:28:37 crc kubenswrapper[4795]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/bootstrap-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 05 08:28:37 crc kubenswrapper[4795]: 
[+]poststarthook/start-kube-aggregator-informers ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/apiservice-registration-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/apiservice-discovery-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]autoregister-completion ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/apiservice-openapi-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 05 08:28:37 crc kubenswrapper[4795]: livez check failed Dec 05 08:28:37 crc kubenswrapper[4795]: I1205 08:28:37.772466 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:28:38 crc kubenswrapper[4795]: I1205 08:28:38.152341 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 08:28:38 crc kubenswrapper[4795]: I1205 08:28:38.152408 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0" exitCode=1 Dec 05 08:28:38 crc kubenswrapper[4795]: I1205 08:28:38.152450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0"} Dec 05 08:28:38 crc kubenswrapper[4795]: I1205 08:28:38.153026 4795 scope.go:117] "RemoveContainer" containerID="3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0" Dec 05 08:28:39 crc kubenswrapper[4795]: I1205 08:28:39.163091 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 08:28:39 crc kubenswrapper[4795]: I1205 08:28:39.163487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3b426ebde8fa953cdec7463d3b37afa74ead9e40176f03f73484ab4b183926ed"} Dec 05 08:28:41 crc kubenswrapper[4795]: I1205 08:28:41.472990 4795 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:41 crc kubenswrapper[4795]: I1205 08:28:41.789686 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2bacc2f2-a972-4259-bf12-b873a1a10fd6" Dec 05 08:28:42 crc kubenswrapper[4795]: I1205 08:28:42.183907 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="42252db7-6e43-427a-9257-3516071eb545" Dec 05 08:28:42 crc kubenswrapper[4795]: I1205 08:28:42.183955 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="42252db7-6e43-427a-9257-3516071eb545" Dec 05 08:28:42 crc kubenswrapper[4795]: I1205 08:28:42.190408 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2bacc2f2-a972-4259-bf12-b873a1a10fd6" Dec 05 08:28:45 crc kubenswrapper[4795]: I1205 08:28:45.868207 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:28:47 crc kubenswrapper[4795]: I1205 08:28:47.727434 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:28:47 crc kubenswrapper[4795]: I1205 08:28:47.727764 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 08:28:47 crc kubenswrapper[4795]: I1205 08:28:47.727982 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 08:28:47 crc kubenswrapper[4795]: I1205 08:28:47.905960 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 05 08:28:47 crc kubenswrapper[4795]: I1205 08:28:47.916021 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 05 08:28:48 crc kubenswrapper[4795]: I1205 08:28:48.232578 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 05 08:28:48 crc kubenswrapper[4795]: I1205 08:28:48.998178 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 05 08:28:49 crc kubenswrapper[4795]: I1205 08:28:49.118964 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 05 08:28:49 crc kubenswrapper[4795]: I1205 08:28:49.287675 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 05 08:28:49 crc kubenswrapper[4795]: I1205 08:28:49.992767 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 05 08:28:51 crc kubenswrapper[4795]: I1205 08:28:51.090941 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 05 08:28:51 crc kubenswrapper[4795]: I1205 08:28:51.908212 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 05 08:28:52 crc kubenswrapper[4795]: I1205 08:28:52.710557 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 05 08:28:52 crc kubenswrapper[4795]: I1205 08:28:52.816166 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.219003 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.272637 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.321111 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.334549 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.459841 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.501718 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.808307 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.814534 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.867516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 05 08:28:53 crc kubenswrapper[4795]: I1205 08:28:53.935409 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 05 08:28:54 crc kubenswrapper[4795]: I1205 08:28:54.290517 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 05 08:28:54 crc kubenswrapper[4795]: I1205 08:28:54.388848 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 05 08:28:54 crc kubenswrapper[4795]: I1205 08:28:54.552948 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 05 08:28:54 crc kubenswrapper[4795]: I1205 08:28:54.704507 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 05 08:28:54 crc kubenswrapper[4795]: 
I1205 08:28:54.808485 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 05 08:28:54 crc kubenswrapper[4795]: I1205 08:28:54.846244 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 05 08:28:54 crc kubenswrapper[4795]: I1205 08:28:54.861813 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 05 08:28:54 crc kubenswrapper[4795]: I1205 08:28:54.969408 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 05 08:28:55 crc kubenswrapper[4795]: I1205 08:28:55.027448 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 05 08:28:55 crc kubenswrapper[4795]: I1205 08:28:55.062962 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 05 08:28:55 crc kubenswrapper[4795]: I1205 08:28:55.649009 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 05 08:28:55 crc kubenswrapper[4795]: I1205 08:28:55.722464 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 05 08:28:55 crc kubenswrapper[4795]: I1205 08:28:55.825436 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 05 08:28:55 crc kubenswrapper[4795]: I1205 08:28:55.867113 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.104410 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.305381 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.345697 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.348082 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.531448 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.655777 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.817363 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.946478 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.959477 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9b5qq" podStartSLOduration=37.645677317 podStartE2EDuration="41.959442863s" podCreationTimestamp="2025-12-05 08:28:15 +0000 UTC" firstStartedPulling="2025-12-05 08:28:16.921113726 +0000 UTC m=+248.493717465" lastFinishedPulling="2025-12-05 08:28:21.234879272 +0000 UTC m=+252.807483011" observedRunningTime="2025-12-05 08:28:21.990218889 +0000 UTC m=+253.562822628" watchObservedRunningTime="2025-12-05 08:28:56.959442863 +0000 UTC m=+288.532046602" Dec 05 08:28:56 crc 
kubenswrapper[4795]: I1205 08:28:56.962333 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tn798","openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.962471 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.970247 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.985559 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.985534758 podStartE2EDuration="15.985534758s" podCreationTimestamp="2025-12-05 08:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:28:56.983765684 +0000 UTC m=+288.556369453" watchObservedRunningTime="2025-12-05 08:28:56.985534758 +0000 UTC m=+288.558138497" Dec 05 08:28:56 crc kubenswrapper[4795]: I1205 08:28:56.987398 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.129415 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.251758 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.299787 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.402142 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.581498 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.728567 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.728713 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.773948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.827406 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.943049 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 08:28:57 crc kubenswrapper[4795]: I1205 08:28:57.944420 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.066973 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 05 08:28:58 crc 
kubenswrapper[4795]: I1205 08:28:58.247887 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.283806 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.369826 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.371776 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.384301 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.554405 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.604022 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.629837 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.659402 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.702760 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.756665 4795 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="41bb386f-8261-4203-a385-f2918e5f9718" path="/var/lib/kubelet/pods/41bb386f-8261-4203-a385-f2918e5f9718/volumes" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.769833 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 05 08:28:58 crc kubenswrapper[4795]: I1205 08:28:58.847938 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.182837 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.210387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.309914 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.341107 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.341215 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.433084 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.490471 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.557916 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 05 
08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.564541 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.594558 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.632435 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.666147 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.694087 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.809962 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.815509 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 05 08:28:59 crc kubenswrapper[4795]: I1205 08:28:59.876447 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.259815 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.263167 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.333715 4795 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.365741 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.427030 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.535648 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.594592 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.674374 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.780686 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.857718 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 05 08:29:00 crc kubenswrapper[4795]: I1205 08:29:00.939756 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.111626 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.126323 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 05 
08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.126348 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.161530 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.192303 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.243521 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.386943 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.419829 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.541113 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.606848 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.619143 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.708518 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.721545 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.775675 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.806241 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 05 08:29:01 crc kubenswrapper[4795]: I1205 08:29:01.973942 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.023032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.106945 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.118979 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.138412 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.191572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.277348 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.318289 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.324472 4795 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.377741 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.468868 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.478303 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.542776 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.557068 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.573587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.595681 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.687072 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.748040 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.749332 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-serving-cert" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.829718 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.857574 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.858214 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.879601 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 05 08:29:02 crc kubenswrapper[4795]: I1205 08:29:02.887819 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.003985 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.004379 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c5b924ec64bd9e684bb83afccd57c8c5a3473744eede3510a9aad5bfe735fb14" gracePeriod=5 Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.024703 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.105957 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 
08:29:03.107491 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.168595 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.200536 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.244690 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.261387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.538801 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.545043 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.629486 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.774753 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.806712 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.826274 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.826410 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.852843 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.910564 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.979296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 05 08:29:03 crc kubenswrapper[4795]: I1205 08:29:03.998654 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.057646 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.232302 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.303150 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.306127 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.365892 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.374477 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.415544 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.416000 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.424171 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.466756 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.510315 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.527517 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.551692 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.564627 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.570871 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.648119 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 
08:29:04.691091 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.754239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.759914 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.848467 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.872207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.895385 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.914516 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.946535 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 05 08:29:04 crc kubenswrapper[4795]: I1205 08:29:04.985796 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.039986 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.092494 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.116787 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.177394 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.216876 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.219389 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.226572 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.379886 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.412714 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.587470 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.667127 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.677019 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.690560 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.702628 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.899839 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.910841 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.928050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.977660 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 05 08:29:05 crc kubenswrapper[4795]: I1205 08:29:05.982777 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.022061 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.105575 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.247864 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.295016 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.333501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.410796 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.497451 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.512541 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.653367 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.699682 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.772630 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.869504 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.963186 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.975169 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 05 08:29:06 crc kubenswrapper[4795]: I1205 08:29:06.985927 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.067992 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.096972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.144882 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.145800 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.152538 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.203134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.203764 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.353094 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.394599 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.441272 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.477830 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.677041 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.728152 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.728236 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.728323 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.729315 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"3b426ebde8fa953cdec7463d3b37afa74ead9e40176f03f73484ab4b183926ed"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.729488 4795 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://3b426ebde8fa953cdec7463d3b37afa74ead9e40176f03f73484ab4b183926ed" gracePeriod=30 Dec 05 08:29:07 crc kubenswrapper[4795]: I1205 08:29:07.956050 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.017286 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.215820 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.339156 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.417308 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.417318 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.444586 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.444665 4795 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c5b924ec64bd9e684bb83afccd57c8c5a3473744eede3510a9aad5bfe735fb14" exitCode=137 Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.604576 4795 cert_rotation.go:91] certificate rotation 
detected, shutting down client connections to start using new credentials Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.609980 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.610088 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.678907 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.678962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679046 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679132 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679165 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679163 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679195 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679728 4795 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679756 4795 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679774 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.679791 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.692003 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.692295 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.725400 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.761726 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.781287 4795 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.820245 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.837152 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6869cbc5df-vgjjb"] Dec 05 08:29:08 crc kubenswrapper[4795]: E1205 08:29:08.837410 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41bb386f-8261-4203-a385-f2918e5f9718" containerName="oauth-openshift" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.837423 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bb386f-8261-4203-a385-f2918e5f9718" containerName="oauth-openshift" Dec 05 08:29:08 crc kubenswrapper[4795]: E1205 08:29:08.837443 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" containerName="installer" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.837451 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" containerName="installer" Dec 05 08:29:08 crc kubenswrapper[4795]: E1205 08:29:08.837470 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.837475 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.837563 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b992d50-5e09-444d-813a-a2c4cfa25e05" containerName="installer" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.837578 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.837585 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="41bb386f-8261-4203-a385-f2918e5f9718" containerName="oauth-openshift" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.838077 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.840591 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.841771 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.843491 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.844859 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.845539 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.845826 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.847795 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.847924 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.848388 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.848509 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 05 08:29:08 crc 
kubenswrapper[4795]: I1205 08:29:08.848590 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.850933 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.858002 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.860149 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.860250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6869cbc5df-vgjjb"] Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.883813 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.883889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee90e778-4502-4d66-bbd2-1cb2176b5015-audit-dir\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.883928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884061 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsssv\" (UniqueName: \"kubernetes.io/projected/ee90e778-4502-4d66-bbd2-1cb2176b5015-kube-api-access-wsssv\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884189 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-audit-policies\") pod 
\"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884277 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" 
Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884474 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.884535 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.885760 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.978716 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.985738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.985778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.985815 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.985844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.985879 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " 
pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.985917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.985946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986002 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee90e778-4502-4d66-bbd2-1cb2176b5015-audit-dir\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986077 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986102 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsssv\" (UniqueName: \"kubernetes.io/projected/ee90e778-4502-4d66-bbd2-1cb2176b5015-kube-api-access-wsssv\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986151 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-audit-policies\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" 
Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ee90e778-4502-4d66-bbd2-1cb2176b5015-audit-dir\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.986861 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.987002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-audit-policies\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.987513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-vgjjb\" 
(UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.991288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.991329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.991439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.991683 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.992037 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.996631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:08 crc kubenswrapper[4795]: I1205 08:29:08.998271 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.000644 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ee90e778-4502-4d66-bbd2-1cb2176b5015-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.004813 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsssv\" (UniqueName: \"kubernetes.io/projected/ee90e778-4502-4d66-bbd2-1cb2176b5015-kube-api-access-wsssv\") pod \"oauth-openshift-6869cbc5df-vgjjb\" (UID: \"ee90e778-4502-4d66-bbd2-1cb2176b5015\") " 
pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.159123 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.425151 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6869cbc5df-vgjjb"] Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.449180 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.452745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" event={"ID":"ee90e778-4502-4d66-bbd2-1cb2176b5015","Type":"ContainerStarted","Data":"11c03cb00c2cbd41eb47bbbdd7716d6f993d3e743fa0e43cbd24ba0d6d2f325a"} Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.454102 4795 scope.go:117] "RemoveContainer" containerID="c5b924ec64bd9e684bb83afccd57c8c5a3473744eede3510a9aad5bfe735fb14" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.454140 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.687981 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.753480 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.854526 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 05 08:29:09 crc kubenswrapper[4795]: I1205 08:29:09.877474 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 05 08:29:10 crc kubenswrapper[4795]: I1205 08:29:10.191740 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 05 08:29:10 crc kubenswrapper[4795]: I1205 08:29:10.462634 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" event={"ID":"ee90e778-4502-4d66-bbd2-1cb2176b5015","Type":"ContainerStarted","Data":"c7326e604a17713eef8ddb2adb2d09fed6f82dabc989b3bba3b00001363ae134"} Dec 05 08:29:10 crc kubenswrapper[4795]: I1205 08:29:10.463576 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:10 crc kubenswrapper[4795]: I1205 08:29:10.476831 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" Dec 05 08:29:10 crc kubenswrapper[4795]: I1205 08:29:10.496248 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6869cbc5df-vgjjb" podStartSLOduration=64.496225357 
podStartE2EDuration="1m4.496225357s" podCreationTimestamp="2025-12-05 08:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:29:10.495955739 +0000 UTC m=+302.068559498" watchObservedRunningTime="2025-12-05 08:29:10.496225357 +0000 UTC m=+302.068829096" Dec 05 08:29:10 crc kubenswrapper[4795]: I1205 08:29:10.635714 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 05 08:29:10 crc kubenswrapper[4795]: I1205 08:29:10.793631 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.299265 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vksgm"] Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.300347 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" podUID="a5628405-485f-42a4-ba10-db97a6df14b5" containerName="controller-manager" containerID="cri-o://a5ffdf754d494e56c98f12d75b2b69f03ff93dc68dc320d72b38bf8cd67db986" gracePeriod=30 Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.337350 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hpn6h"] Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.364294 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2"] Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.364870 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" podUID="e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" 
containerName="route-controller-manager" containerID="cri-o://49ff7c92330d91d2ce189ef283662c4a6ea7b7cbfe40705fdd5556a3f97db13d" gracePeriod=30 Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.631819 4795 generic.go:334] "Generic (PLEG): container finished" podID="a5628405-485f-42a4-ba10-db97a6df14b5" containerID="a5ffdf754d494e56c98f12d75b2b69f03ff93dc68dc320d72b38bf8cd67db986" exitCode=0 Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.631911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" event={"ID":"a5628405-485f-42a4-ba10-db97a6df14b5","Type":"ContainerDied","Data":"a5ffdf754d494e56c98f12d75b2b69f03ff93dc68dc320d72b38bf8cd67db986"} Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.649243 4795 generic.go:334] "Generic (PLEG): container finished" podID="e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" containerID="49ff7c92330d91d2ce189ef283662c4a6ea7b7cbfe40705fdd5556a3f97db13d" exitCode=0 Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.649299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" event={"ID":"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae","Type":"ContainerDied","Data":"49ff7c92330d91d2ce189ef283662c4a6ea7b7cbfe40705fdd5556a3f97db13d"} Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.842427 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.896348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-proxy-ca-bundles\") pod \"a5628405-485f-42a4-ba10-db97a6df14b5\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.896934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-config\") pod \"a5628405-485f-42a4-ba10-db97a6df14b5\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.896970 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5628405-485f-42a4-ba10-db97a6df14b5-serving-cert\") pod \"a5628405-485f-42a4-ba10-db97a6df14b5\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.896998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrv2k\" (UniqueName: \"kubernetes.io/projected/a5628405-485f-42a4-ba10-db97a6df14b5-kube-api-access-wrv2k\") pod \"a5628405-485f-42a4-ba10-db97a6df14b5\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.897092 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-client-ca\") pod \"a5628405-485f-42a4-ba10-db97a6df14b5\" (UID: \"a5628405-485f-42a4-ba10-db97a6df14b5\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.897190 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a5628405-485f-42a4-ba10-db97a6df14b5" (UID: "a5628405-485f-42a4-ba10-db97a6df14b5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.897435 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.897862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-client-ca" (OuterVolumeSpecName: "client-ca") pod "a5628405-485f-42a4-ba10-db97a6df14b5" (UID: "a5628405-485f-42a4-ba10-db97a6df14b5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.898318 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-config" (OuterVolumeSpecName: "config") pod "a5628405-485f-42a4-ba10-db97a6df14b5" (UID: "a5628405-485f-42a4-ba10-db97a6df14b5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.910398 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5628405-485f-42a4-ba10-db97a6df14b5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a5628405-485f-42a4-ba10-db97a6df14b5" (UID: "a5628405-485f-42a4-ba10-db97a6df14b5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.911270 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5628405-485f-42a4-ba10-db97a6df14b5-kube-api-access-wrv2k" (OuterVolumeSpecName: "kube-api-access-wrv2k") pod "a5628405-485f-42a4-ba10-db97a6df14b5" (UID: "a5628405-485f-42a4-ba10-db97a6df14b5"). InnerVolumeSpecName "kube-api-access-wrv2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.926730 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh"] Dec 05 08:29:31 crc kubenswrapper[4795]: E1205 08:29:31.926993 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5628405-485f-42a4-ba10-db97a6df14b5" containerName="controller-manager" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.927006 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5628405-485f-42a4-ba10-db97a6df14b5" containerName="controller-manager" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.927148 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5628405-485f-42a4-ba10-db97a6df14b5" containerName="controller-manager" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.927582 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.937583 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.949447 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh"] Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998416 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-serving-cert\") pod \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7l56\" (UniqueName: \"kubernetes.io/projected/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-kube-api-access-p7l56\") pod \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998532 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-client-ca\") pod \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998591 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-config\") pod \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\" (UID: \"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae\") " Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-client-ca\") pod 
\"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-proxy-ca-bundles\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752496ab-e447-4601-8c9e-5c4993ffde4a-serving-cert\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-758cw\" (UniqueName: \"kubernetes.io/projected/752496ab-e447-4601-8c9e-5c4993ffde4a-kube-api-access-758cw\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998894 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-config\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998970 4795 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998983 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5628405-485f-42a4-ba10-db97a6df14b5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.998992 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrv2k\" (UniqueName: \"kubernetes.io/projected/a5628405-485f-42a4-ba10-db97a6df14b5-kube-api-access-wrv2k\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:31 crc kubenswrapper[4795]: I1205 08:29:31.999003 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5628405-485f-42a4-ba10-db97a6df14b5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:31.999997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-client-ca" (OuterVolumeSpecName: "client-ca") pod "e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" (UID: "e3b1aeac-562a-4fbe-8b76-3d3143aaeeae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.000094 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-config" (OuterVolumeSpecName: "config") pod "e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" (UID: "e3b1aeac-562a-4fbe-8b76-3d3143aaeeae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.003677 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-kube-api-access-p7l56" (OuterVolumeSpecName: "kube-api-access-p7l56") pod "e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" (UID: "e3b1aeac-562a-4fbe-8b76-3d3143aaeeae"). InnerVolumeSpecName "kube-api-access-p7l56". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.004777 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" (UID: "e3b1aeac-562a-4fbe-8b76-3d3143aaeeae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.043757 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj"] Dec 05 08:29:32 crc kubenswrapper[4795]: E1205 08:29:32.044414 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" containerName="route-controller-manager" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.044439 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" containerName="route-controller-manager" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.044767 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" containerName="route-controller-manager" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.051294 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.065548 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj"] Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.099954 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-config\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.100350 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9zh\" (UniqueName: \"kubernetes.io/projected/4b759500-6b17-4edb-9acc-6c65b2e2efc5-kube-api-access-7c9zh\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.100446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b759500-6b17-4edb-9acc-6c65b2e2efc5-serving-cert\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.100543 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-client-ca\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: 
\"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.100659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-proxy-ca-bundles\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.100766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752496ab-e447-4601-8c9e-5c4993ffde4a-serving-cert\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.100860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-758cw\" (UniqueName: \"kubernetes.io/projected/752496ab-e447-4601-8c9e-5c4993ffde4a-kube-api-access-758cw\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.100947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-client-ca\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.101049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-config\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.101168 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7l56\" (UniqueName: \"kubernetes.io/projected/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-kube-api-access-p7l56\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.101256 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.102539 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.102674 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.101897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-client-ca\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.102505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-proxy-ca-bundles\") pod 
\"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.103247 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-config\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.105827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752496ab-e447-4601-8c9e-5c4993ffde4a-serving-cert\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.120816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-758cw\" (UniqueName: \"kubernetes.io/projected/752496ab-e447-4601-8c9e-5c4993ffde4a-kube-api-access-758cw\") pod \"controller-manager-6c97bdb98c-pqvqh\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.203756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-client-ca\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.204145 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-config\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.204260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9zh\" (UniqueName: \"kubernetes.io/projected/4b759500-6b17-4edb-9acc-6c65b2e2efc5-kube-api-access-7c9zh\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.204371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b759500-6b17-4edb-9acc-6c65b2e2efc5-serving-cert\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.205027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-client-ca\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.205546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-config\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc 
kubenswrapper[4795]: I1205 08:29:32.208584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b759500-6b17-4edb-9acc-6c65b2e2efc5-serving-cert\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.221429 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9zh\" (UniqueName: \"kubernetes.io/projected/4b759500-6b17-4edb-9acc-6c65b2e2efc5-kube-api-access-7c9zh\") pod \"route-controller-manager-5cb4468b7-n5tjj\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.251268 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.373653 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.507998 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh"] Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.674233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" event={"ID":"752496ab-e447-4601-8c9e-5c4993ffde4a","Type":"ContainerStarted","Data":"87f036f2845f065d6354c1fa1d9856a076a5ee105736b408bb80ec66ee4967a4"} Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.678156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" event={"ID":"a5628405-485f-42a4-ba10-db97a6df14b5","Type":"ContainerDied","Data":"459ccda15d82f50dac0c57abe2a688eedefaeedde67e373ca53a90ff21b79c6c"} Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.678213 4795 scope.go:117] "RemoveContainer" containerID="a5ffdf754d494e56c98f12d75b2b69f03ff93dc68dc320d72b38bf8cd67db986" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.678980 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vksgm" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.694396 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" event={"ID":"e3b1aeac-562a-4fbe-8b76-3d3143aaeeae","Type":"ContainerDied","Data":"82e40d64e4aa9c65da43d12a968851dce4aa407b924729b1e89dd16d50a02f95"} Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.694567 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.723521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj"] Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.727556 4795 scope.go:117] "RemoveContainer" containerID="49ff7c92330d91d2ce189ef283662c4a6ea7b7cbfe40705fdd5556a3f97db13d" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.731811 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vksgm"] Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.735352 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vksgm"] Dec 05 08:29:32 crc kubenswrapper[4795]: W1205 08:29:32.739438 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b759500_6b17_4edb_9acc_6c65b2e2efc5.slice/crio-27e2c185545c9b9745b3eb2ad3c1338b83ea2951a016275d437bf62fb406d660 WatchSource:0}: Error finding container 27e2c185545c9b9745b3eb2ad3c1338b83ea2951a016275d437bf62fb406d660: Status 404 returned error can't find the container with id 27e2c185545c9b9745b3eb2ad3c1338b83ea2951a016275d437bf62fb406d660 Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.765409 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5628405-485f-42a4-ba10-db97a6df14b5" path="/var/lib/kubelet/pods/a5628405-485f-42a4-ba10-db97a6df14b5/volumes" Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.783697 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2"] Dec 05 08:29:32 crc kubenswrapper[4795]: I1205 08:29:32.790583 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hkgz2"] Dec 05 08:29:33 crc kubenswrapper[4795]: I1205 08:29:33.701439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" event={"ID":"752496ab-e447-4601-8c9e-5c4993ffde4a","Type":"ContainerStarted","Data":"3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f"} Dec 05 08:29:33 crc kubenswrapper[4795]: I1205 08:29:33.701777 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:33 crc kubenswrapper[4795]: I1205 08:29:33.705132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" event={"ID":"4b759500-6b17-4edb-9acc-6c65b2e2efc5","Type":"ContainerStarted","Data":"3fc6534f715e08a585c90a7764037f7caa16c1da7bfc56be547f3f0e7fad6dea"} Dec 05 08:29:33 crc kubenswrapper[4795]: I1205 08:29:33.705165 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" event={"ID":"4b759500-6b17-4edb-9acc-6c65b2e2efc5","Type":"ContainerStarted","Data":"27e2c185545c9b9745b3eb2ad3c1338b83ea2951a016275d437bf62fb406d660"} Dec 05 08:29:33 crc kubenswrapper[4795]: I1205 08:29:33.705560 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:33 crc kubenswrapper[4795]: I1205 08:29:33.708891 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:29:33 crc kubenswrapper[4795]: I1205 08:29:33.712581 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:33 crc 
kubenswrapper[4795]: I1205 08:29:33.769202 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" podStartSLOduration=2.769168844 podStartE2EDuration="2.769168844s" podCreationTimestamp="2025-12-05 08:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:29:33.72356277 +0000 UTC m=+325.296166509" watchObservedRunningTime="2025-12-05 08:29:33.769168844 +0000 UTC m=+325.341772583" Dec 05 08:29:33 crc kubenswrapper[4795]: I1205 08:29:33.785911 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" podStartSLOduration=2.7858886370000002 podStartE2EDuration="2.785888637s" podCreationTimestamp="2025-12-05 08:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:29:33.781912546 +0000 UTC m=+325.354516275" watchObservedRunningTime="2025-12-05 08:29:33.785888637 +0000 UTC m=+325.358492366" Dec 05 08:29:34 crc kubenswrapper[4795]: I1205 08:29:34.755423 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b1aeac-562a-4fbe-8b76-3d3143aaeeae" path="/var/lib/kubelet/pods/e3b1aeac-562a-4fbe-8b76-3d3143aaeeae/volumes" Dec 05 08:29:38 crc kubenswrapper[4795]: I1205 08:29:38.739559 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 08:29:38 crc kubenswrapper[4795]: I1205 08:29:38.745110 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 05 08:29:38 crc kubenswrapper[4795]: I1205 
08:29:38.745164 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3b426ebde8fa953cdec7463d3b37afa74ead9e40176f03f73484ab4b183926ed" exitCode=137 Dec 05 08:29:38 crc kubenswrapper[4795]: I1205 08:29:38.745207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3b426ebde8fa953cdec7463d3b37afa74ead9e40176f03f73484ab4b183926ed"} Dec 05 08:29:38 crc kubenswrapper[4795]: I1205 08:29:38.745253 4795 scope.go:117] "RemoveContainer" containerID="3d520d9644311900a8fd126a7eb9f9c546c98f658a14671ec555d9664f13baf0" Dec 05 08:29:39 crc kubenswrapper[4795]: I1205 08:29:39.752735 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 05 08:29:39 crc kubenswrapper[4795]: I1205 08:29:39.754334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"507a658c41f897ede1d7dce6b9b5092f81451ea327ed7fb983a65bfcc8e4156d"} Dec 05 08:29:45 crc kubenswrapper[4795]: I1205 08:29:45.868393 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:29:47 crc kubenswrapper[4795]: I1205 08:29:47.727575 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:29:47 crc kubenswrapper[4795]: I1205 08:29:47.731519 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:29:55 crc kubenswrapper[4795]: I1205 08:29:55.877239 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 05 08:29:56 crc kubenswrapper[4795]: I1205 08:29:56.437185 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" podUID="8b4264f3-206f-4bb6-ba2f-8ba2fa485060" containerName="registry" containerID="cri-o://4518276949a843b0206ff095d98d6173ff5a7941be07ca04a5cc1bff846108b8" gracePeriod=30 Dec 05 08:29:56 crc kubenswrapper[4795]: I1205 08:29:56.860819 4795 generic.go:334] "Generic (PLEG): container finished" podID="8b4264f3-206f-4bb6-ba2f-8ba2fa485060" containerID="4518276949a843b0206ff095d98d6173ff5a7941be07ca04a5cc1bff846108b8" exitCode=0 Dec 05 08:29:56 crc kubenswrapper[4795]: I1205 08:29:56.860920 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" event={"ID":"8b4264f3-206f-4bb6-ba2f-8ba2fa485060","Type":"ContainerDied","Data":"4518276949a843b0206ff095d98d6173ff5a7941be07ca04a5cc1bff846108b8"} Dec 05 08:29:56 crc kubenswrapper[4795]: I1205 08:29:56.963302 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.008923 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-ca-trust-extracted\") pod \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.009002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-trusted-ca\") pod \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.009319 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.009374 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-certificates\") pod \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.009431 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzxqk\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-kube-api-access-jzxqk\") pod \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.009506 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-tls\") pod \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.009533 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-installation-pull-secrets\") pod \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.009559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-bound-sa-token\") pod \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\" (UID: \"8b4264f3-206f-4bb6-ba2f-8ba2fa485060\") " Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.010481 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8b4264f3-206f-4bb6-ba2f-8ba2fa485060" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.011372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8b4264f3-206f-4bb6-ba2f-8ba2fa485060" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.020042 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8b4264f3-206f-4bb6-ba2f-8ba2fa485060" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.021270 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-kube-api-access-jzxqk" (OuterVolumeSpecName: "kube-api-access-jzxqk") pod "8b4264f3-206f-4bb6-ba2f-8ba2fa485060" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060"). InnerVolumeSpecName "kube-api-access-jzxqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.022156 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8b4264f3-206f-4bb6-ba2f-8ba2fa485060" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.024464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8b4264f3-206f-4bb6-ba2f-8ba2fa485060" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.029898 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8b4264f3-206f-4bb6-ba2f-8ba2fa485060" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.046199 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8b4264f3-206f-4bb6-ba2f-8ba2fa485060" (UID: "8b4264f3-206f-4bb6-ba2f-8ba2fa485060"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.111522 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.111581 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzxqk\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-kube-api-access-jzxqk\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.111594 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.111630 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.111642 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.111653 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.111665 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b4264f3-206f-4bb6-ba2f-8ba2fa485060-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.552112 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj"] Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.552715 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" podUID="4b759500-6b17-4edb-9acc-6c65b2e2efc5" containerName="route-controller-manager" containerID="cri-o://3fc6534f715e08a585c90a7764037f7caa16c1da7bfc56be547f3f0e7fad6dea" gracePeriod=30 Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.869658 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" event={"ID":"8b4264f3-206f-4bb6-ba2f-8ba2fa485060","Type":"ContainerDied","Data":"6d01e12402f478a5a58c8ccb8cef425655ec0c1e4722d9022d83ac755abae530"} Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.869685 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hpn6h" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.869756 4795 scope.go:117] "RemoveContainer" containerID="4518276949a843b0206ff095d98d6173ff5a7941be07ca04a5cc1bff846108b8" Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.872540 4795 generic.go:334] "Generic (PLEG): container finished" podID="4b759500-6b17-4edb-9acc-6c65b2e2efc5" containerID="3fc6534f715e08a585c90a7764037f7caa16c1da7bfc56be547f3f0e7fad6dea" exitCode=0 Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.872603 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" event={"ID":"4b759500-6b17-4edb-9acc-6c65b2e2efc5","Type":"ContainerDied","Data":"3fc6534f715e08a585c90a7764037f7caa16c1da7bfc56be547f3f0e7fad6dea"} Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.923646 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hpn6h"] Dec 05 08:29:57 crc kubenswrapper[4795]: I1205 08:29:57.936200 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hpn6h"] Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.292180 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.325219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-config\") pod \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.325298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-client-ca\") pod \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.325342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c9zh\" (UniqueName: \"kubernetes.io/projected/4b759500-6b17-4edb-9acc-6c65b2e2efc5-kube-api-access-7c9zh\") pod \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.325370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b759500-6b17-4edb-9acc-6c65b2e2efc5-serving-cert\") pod \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\" (UID: \"4b759500-6b17-4edb-9acc-6c65b2e2efc5\") " Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.326320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-client-ca" (OuterVolumeSpecName: "client-ca") pod "4b759500-6b17-4edb-9acc-6c65b2e2efc5" (UID: "4b759500-6b17-4edb-9acc-6c65b2e2efc5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.326421 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-config" (OuterVolumeSpecName: "config") pod "4b759500-6b17-4edb-9acc-6c65b2e2efc5" (UID: "4b759500-6b17-4edb-9acc-6c65b2e2efc5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.330838 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b759500-6b17-4edb-9acc-6c65b2e2efc5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4b759500-6b17-4edb-9acc-6c65b2e2efc5" (UID: "4b759500-6b17-4edb-9acc-6c65b2e2efc5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.335363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b759500-6b17-4edb-9acc-6c65b2e2efc5-kube-api-access-7c9zh" (OuterVolumeSpecName: "kube-api-access-7c9zh") pod "4b759500-6b17-4edb-9acc-6c65b2e2efc5" (UID: "4b759500-6b17-4edb-9acc-6c65b2e2efc5"). InnerVolumeSpecName "kube-api-access-7c9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.427695 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.427770 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b759500-6b17-4edb-9acc-6c65b2e2efc5-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.427838 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c9zh\" (UniqueName: \"kubernetes.io/projected/4b759500-6b17-4edb-9acc-6c65b2e2efc5-kube-api-access-7c9zh\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.427860 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b759500-6b17-4edb-9acc-6c65b2e2efc5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.757892 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b4264f3-206f-4bb6-ba2f-8ba2fa485060" path="/var/lib/kubelet/pods/8b4264f3-206f-4bb6-ba2f-8ba2fa485060/volumes" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.883243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" event={"ID":"4b759500-6b17-4edb-9acc-6c65b2e2efc5","Type":"ContainerDied","Data":"27e2c185545c9b9745b3eb2ad3c1338b83ea2951a016275d437bf62fb406d660"} Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.883309 4795 scope.go:117] "RemoveContainer" containerID="3fc6534f715e08a585c90a7764037f7caa16c1da7bfc56be547f3f0e7fad6dea" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.883424 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj" Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.904671 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj"] Dec 05 08:29:58 crc kubenswrapper[4795]: I1205 08:29:58.953084 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cb4468b7-n5tjj"] Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.515537 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9"] Dec 05 08:29:59 crc kubenswrapper[4795]: E1205 08:29:59.515865 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b4264f3-206f-4bb6-ba2f-8ba2fa485060" containerName="registry" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.515879 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b4264f3-206f-4bb6-ba2f-8ba2fa485060" containerName="registry" Dec 05 08:29:59 crc kubenswrapper[4795]: E1205 08:29:59.515896 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b759500-6b17-4edb-9acc-6c65b2e2efc5" containerName="route-controller-manager" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.515902 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b759500-6b17-4edb-9acc-6c65b2e2efc5" containerName="route-controller-manager" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.516005 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b759500-6b17-4edb-9acc-6c65b2e2efc5" containerName="route-controller-manager" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.516017 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b4264f3-206f-4bb6-ba2f-8ba2fa485060" containerName="registry" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.516474 4795 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.519239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.519805 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.520679 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.520846 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.520938 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.520933 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.539453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9"] Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.544696 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05891631-6f0c-4f4c-a385-40b27f881c86-config\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.544754 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05891631-6f0c-4f4c-a385-40b27f881c86-client-ca\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.544826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbnx\" (UniqueName: \"kubernetes.io/projected/05891631-6f0c-4f4c-a385-40b27f881c86-kube-api-access-trbnx\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.544846 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05891631-6f0c-4f4c-a385-40b27f881c86-serving-cert\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.646125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05891631-6f0c-4f4c-a385-40b27f881c86-config\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.646198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05891631-6f0c-4f4c-a385-40b27f881c86-client-ca\") pod 
\"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.646271 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trbnx\" (UniqueName: \"kubernetes.io/projected/05891631-6f0c-4f4c-a385-40b27f881c86-kube-api-access-trbnx\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.646303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05891631-6f0c-4f4c-a385-40b27f881c86-serving-cert\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.647762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/05891631-6f0c-4f4c-a385-40b27f881c86-client-ca\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.648034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05891631-6f0c-4f4c-a385-40b27f881c86-config\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.652370 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05891631-6f0c-4f4c-a385-40b27f881c86-serving-cert\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.668573 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbnx\" (UniqueName: \"kubernetes.io/projected/05891631-6f0c-4f4c-a385-40b27f881c86-kube-api-access-trbnx\") pod \"route-controller-manager-96c7d59c6-n2vk9\" (UID: \"05891631-6f0c-4f4c-a385-40b27f881c86\") " pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:29:59 crc kubenswrapper[4795]: I1205 08:29:59.837501 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.169915 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm"] Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.170930 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.173220 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.173537 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.193355 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm"] Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.255279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-secret-volume\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.255372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz8kr\" (UniqueName: \"kubernetes.io/projected/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-kube-api-access-wz8kr\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.255398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-config-volume\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.286571 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9"] Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.357698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz8kr\" (UniqueName: \"kubernetes.io/projected/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-kube-api-access-wz8kr\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.357761 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-config-volume\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.357830 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-secret-volume\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.358830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-config-volume\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc 
kubenswrapper[4795]: I1205 08:30:00.366059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-secret-volume\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.384254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz8kr\" (UniqueName: \"kubernetes.io/projected/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-kube-api-access-wz8kr\") pod \"collect-profiles-29415390-gv4vm\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.489176 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.758882 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b759500-6b17-4edb-9acc-6c65b2e2efc5" path="/var/lib/kubelet/pods/4b759500-6b17-4edb-9acc-6c65b2e2efc5/volumes" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.899980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" event={"ID":"05891631-6f0c-4f4c-a385-40b27f881c86","Type":"ContainerStarted","Data":"b99b21bc8ed9380ef80d2705b3c55c9f45f88d244e7ba750c566d7ac14878892"} Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.900045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" event={"ID":"05891631-6f0c-4f4c-a385-40b27f881c86","Type":"ContainerStarted","Data":"71fd7b6dc92f0ba29f9c550c7a78a29426b2c4471a0522c208fff8fbbe68b333"} Dec 05 
08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.903356 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.977078 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" podStartSLOduration=3.977052833 podStartE2EDuration="3.977052833s" podCreationTimestamp="2025-12-05 08:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:30:00.924304128 +0000 UTC m=+352.496907877" watchObservedRunningTime="2025-12-05 08:30:00.977052833 +0000 UTC m=+352.549656572" Dec 05 08:30:00 crc kubenswrapper[4795]: I1205 08:30:00.981961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm"] Dec 05 08:30:01 crc kubenswrapper[4795]: I1205 08:30:01.296607 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-96c7d59c6-n2vk9" Dec 05 08:30:01 crc kubenswrapper[4795]: I1205 08:30:01.912545 4795 generic.go:334] "Generic (PLEG): container finished" podID="dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde" containerID="400e3a6765e627751435991cac890e46499f407d086d62de6b9c83223d438cd7" exitCode=0 Dec 05 08:30:01 crc kubenswrapper[4795]: I1205 08:30:01.914948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" event={"ID":"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde","Type":"ContainerDied","Data":"400e3a6765e627751435991cac890e46499f407d086d62de6b9c83223d438cd7"} Dec 05 08:30:01 crc kubenswrapper[4795]: I1205 08:30:01.915140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" event={"ID":"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde","Type":"ContainerStarted","Data":"43639dc001c0e4e6f609037fb7b94f944fc64772cb1eaecce2779bc53f20f9f6"} Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.309527 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.414826 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-secret-volume\") pod \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.414988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-config-volume\") pod \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.415092 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz8kr\" (UniqueName: \"kubernetes.io/projected/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-kube-api-access-wz8kr\") pod \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\" (UID: \"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde\") " Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.416850 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-config-volume" (OuterVolumeSpecName: "config-volume") pod "dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde" (UID: "dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.424017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde" (UID: "dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.424048 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-kube-api-access-wz8kr" (OuterVolumeSpecName: "kube-api-access-wz8kr") pod "dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde" (UID: "dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde"). InnerVolumeSpecName "kube-api-access-wz8kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.517559 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz8kr\" (UniqueName: \"kubernetes.io/projected/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-kube-api-access-wz8kr\") on node \"crc\" DevicePath \"\"" Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.517650 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.517670 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.930093 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" 
event={"ID":"dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde","Type":"ContainerDied","Data":"43639dc001c0e4e6f609037fb7b94f944fc64772cb1eaecce2779bc53f20f9f6"} Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.930154 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43639dc001c0e4e6f609037fb7b94f944fc64772cb1eaecce2779bc53f20f9f6" Dec 05 08:30:03 crc kubenswrapper[4795]: I1205 08:30:03.930784 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm" Dec 05 08:30:10 crc kubenswrapper[4795]: I1205 08:30:10.827368 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:30:10 crc kubenswrapper[4795]: I1205 08:30:10.827951 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.082270 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh"] Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.083220 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" podUID="752496ab-e447-4601-8c9e-5c4993ffde4a" containerName="controller-manager" containerID="cri-o://3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f" gracePeriod=30 Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.481113 4795 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.652290 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752496ab-e447-4601-8c9e-5c4993ffde4a-serving-cert\") pod \"752496ab-e447-4601-8c9e-5c4993ffde4a\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.652365 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-proxy-ca-bundles\") pod \"752496ab-e447-4601-8c9e-5c4993ffde4a\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.652437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-config\") pod \"752496ab-e447-4601-8c9e-5c4993ffde4a\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.652459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-client-ca\") pod \"752496ab-e447-4601-8c9e-5c4993ffde4a\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.652540 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-758cw\" (UniqueName: \"kubernetes.io/projected/752496ab-e447-4601-8c9e-5c4993ffde4a-kube-api-access-758cw\") pod \"752496ab-e447-4601-8c9e-5c4993ffde4a\" (UID: \"752496ab-e447-4601-8c9e-5c4993ffde4a\") " Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.653566 4795 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-client-ca" (OuterVolumeSpecName: "client-ca") pod "752496ab-e447-4601-8c9e-5c4993ffde4a" (UID: "752496ab-e447-4601-8c9e-5c4993ffde4a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.653666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-config" (OuterVolumeSpecName: "config") pod "752496ab-e447-4601-8c9e-5c4993ffde4a" (UID: "752496ab-e447-4601-8c9e-5c4993ffde4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.653717 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "752496ab-e447-4601-8c9e-5c4993ffde4a" (UID: "752496ab-e447-4601-8c9e-5c4993ffde4a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.657877 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752496ab-e447-4601-8c9e-5c4993ffde4a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "752496ab-e447-4601-8c9e-5c4993ffde4a" (UID: "752496ab-e447-4601-8c9e-5c4993ffde4a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.657957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752496ab-e447-4601-8c9e-5c4993ffde4a-kube-api-access-758cw" (OuterVolumeSpecName: "kube-api-access-758cw") pod "752496ab-e447-4601-8c9e-5c4993ffde4a" (UID: "752496ab-e447-4601-8c9e-5c4993ffde4a"). 
InnerVolumeSpecName "kube-api-access-758cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.753799 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-758cw\" (UniqueName: \"kubernetes.io/projected/752496ab-e447-4601-8c9e-5c4993ffde4a-kube-api-access-758cw\") on node \"crc\" DevicePath \"\"" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.753848 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/752496ab-e447-4601-8c9e-5c4993ffde4a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.753870 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.753892 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:30:29 crc kubenswrapper[4795]: I1205 08:30:29.753909 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/752496ab-e447-4601-8c9e-5c4993ffde4a-client-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.099253 4795 generic.go:334] "Generic (PLEG): container finished" podID="752496ab-e447-4601-8c9e-5c4993ffde4a" containerID="3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f" exitCode=0 Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.099331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" 
event={"ID":"752496ab-e447-4601-8c9e-5c4993ffde4a","Type":"ContainerDied","Data":"3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f"} Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.099376 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" event={"ID":"752496ab-e447-4601-8c9e-5c4993ffde4a","Type":"ContainerDied","Data":"87f036f2845f065d6354c1fa1d9856a076a5ee105736b408bb80ec66ee4967a4"} Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.099406 4795 scope.go:117] "RemoveContainer" containerID="3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.099492 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.138541 4795 scope.go:117] "RemoveContainer" containerID="3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f" Dec 05 08:30:30 crc kubenswrapper[4795]: E1205 08:30:30.140913 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f\": container with ID starting with 3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f not found: ID does not exist" containerID="3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.140978 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f"} err="failed to get container status \"3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f\": rpc error: code = NotFound desc = could not find container \"3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f\": 
container with ID starting with 3eae52d5fd3f647089ee262e857d42774d813de96a9af42b10a6eb3dc82aca0f not found: ID does not exist" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.163607 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh"] Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.169813 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c97bdb98c-pqvqh"] Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.609202 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-949d49544-vwq6w"] Dec 05 08:30:30 crc kubenswrapper[4795]: E1205 08:30:30.610058 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752496ab-e447-4601-8c9e-5c4993ffde4a" containerName="controller-manager" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.610091 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="752496ab-e447-4601-8c9e-5c4993ffde4a" containerName="controller-manager" Dec 05 08:30:30 crc kubenswrapper[4795]: E1205 08:30:30.610116 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde" containerName="collect-profiles" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.610125 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde" containerName="collect-profiles" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.610335 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde" containerName="collect-profiles" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.610368 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="752496ab-e447-4601-8c9e-5c4993ffde4a" containerName="controller-manager" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.611064 4795 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.615161 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.619378 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.628120 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.628925 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.629039 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.630096 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.631928 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.637001 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-949d49544-vwq6w"] Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.754539 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752496ab-e447-4601-8c9e-5c4993ffde4a" path="/var/lib/kubelet/pods/752496ab-e447-4601-8c9e-5c4993ffde4a/volumes" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.770796 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-j552n\" (UniqueName: \"kubernetes.io/projected/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-kube-api-access-j552n\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.770849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-serving-cert\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.770878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-proxy-ca-bundles\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.770927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-client-ca\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.770963 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-config\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " 
pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.872063 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-serving-cert\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.872354 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-proxy-ca-bundles\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.873731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-client-ca\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.874235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-config\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.874757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j552n\" (UniqueName: \"kubernetes.io/projected/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-kube-api-access-j552n\") pod 
\"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.874844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-proxy-ca-bundles\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.875136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-client-ca\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.876504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-config\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.879118 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-serving-cert\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.901439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j552n\" (UniqueName: 
\"kubernetes.io/projected/88f7b6b3-e80e-4c1b-85cf-98a4e625c32a-kube-api-access-j552n\") pod \"controller-manager-949d49544-vwq6w\" (UID: \"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a\") " pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:30 crc kubenswrapper[4795]: I1205 08:30:30.981508 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:31 crc kubenswrapper[4795]: I1205 08:30:31.219156 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-949d49544-vwq6w"] Dec 05 08:30:32 crc kubenswrapper[4795]: I1205 08:30:32.113307 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" event={"ID":"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a","Type":"ContainerStarted","Data":"e974c0bc96e031df6f4ec63fd323dafea422786d71317b2578ae47a73e1fb32e"} Dec 05 08:30:32 crc kubenswrapper[4795]: I1205 08:30:32.113790 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" event={"ID":"88f7b6b3-e80e-4c1b-85cf-98a4e625c32a","Type":"ContainerStarted","Data":"b86033f65716b1996e4e3ae545945135bf098d5ca29298f9a937e6b6032d2230"} Dec 05 08:30:32 crc kubenswrapper[4795]: I1205 08:30:32.115014 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:32 crc kubenswrapper[4795]: I1205 08:30:32.123435 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" Dec 05 08:30:32 crc kubenswrapper[4795]: I1205 08:30:32.137105 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-949d49544-vwq6w" podStartSLOduration=3.137083479 
podStartE2EDuration="3.137083479s" podCreationTimestamp="2025-12-05 08:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:30:32.135039299 +0000 UTC m=+383.707643038" watchObservedRunningTime="2025-12-05 08:30:32.137083479 +0000 UTC m=+383.709687228" Dec 05 08:30:40 crc kubenswrapper[4795]: I1205 08:30:40.827252 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:30:40 crc kubenswrapper[4795]: I1205 08:30:40.828146 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:31:10 crc kubenswrapper[4795]: I1205 08:31:10.827408 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:31:10 crc kubenswrapper[4795]: I1205 08:31:10.828080 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:31:10 crc kubenswrapper[4795]: I1205 08:31:10.828159 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:31:10 crc kubenswrapper[4795]: I1205 08:31:10.828903 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13b0b2039e83205cd0e777e65ca04f19114a91a938417aba534d0de698607453"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 08:31:10 crc kubenswrapper[4795]: I1205 08:31:10.828965 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://13b0b2039e83205cd0e777e65ca04f19114a91a938417aba534d0de698607453" gracePeriod=600 Dec 05 08:31:11 crc kubenswrapper[4795]: I1205 08:31:11.375268 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="13b0b2039e83205cd0e777e65ca04f19114a91a938417aba534d0de698607453" exitCode=0 Dec 05 08:31:11 crc kubenswrapper[4795]: I1205 08:31:11.375450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"13b0b2039e83205cd0e777e65ca04f19114a91a938417aba534d0de698607453"} Dec 05 08:31:11 crc kubenswrapper[4795]: I1205 08:31:11.376324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"8b70c489a5a763f3c299a750acd02daa110fa561056ff0ee76d5721bb0a9a168"} Dec 05 08:31:11 crc kubenswrapper[4795]: I1205 08:31:11.376425 4795 scope.go:117] "RemoveContainer" 
containerID="d5c45dc148d7586afce626b9c768f9977f3c0a106fcf7a34679330c2a256c8ca" Dec 05 08:33:40 crc kubenswrapper[4795]: I1205 08:33:40.827295 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:33:40 crc kubenswrapper[4795]: I1205 08:33:40.828331 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:34:10 crc kubenswrapper[4795]: I1205 08:34:10.827730 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:34:10 crc kubenswrapper[4795]: I1205 08:34:10.828486 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:34:40 crc kubenswrapper[4795]: I1205 08:34:40.828436 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:34:40 crc kubenswrapper[4795]: I1205 08:34:40.829483 4795 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:34:40 crc kubenswrapper[4795]: I1205 08:34:40.829561 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:34:40 crc kubenswrapper[4795]: I1205 08:34:40.830952 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b70c489a5a763f3c299a750acd02daa110fa561056ff0ee76d5721bb0a9a168"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 08:34:40 crc kubenswrapper[4795]: I1205 08:34:40.831039 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://8b70c489a5a763f3c299a750acd02daa110fa561056ff0ee76d5721bb0a9a168" gracePeriod=600 Dec 05 08:34:41 crc kubenswrapper[4795]: I1205 08:34:41.941987 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="8b70c489a5a763f3c299a750acd02daa110fa561056ff0ee76d5721bb0a9a168" exitCode=0 Dec 05 08:34:41 crc kubenswrapper[4795]: I1205 08:34:41.942074 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"8b70c489a5a763f3c299a750acd02daa110fa561056ff0ee76d5721bb0a9a168"} Dec 05 08:34:41 crc kubenswrapper[4795]: I1205 
08:34:41.942494 4795 scope.go:117] "RemoveContainer" containerID="13b0b2039e83205cd0e777e65ca04f19114a91a938417aba534d0de698607453" Dec 05 08:34:42 crc kubenswrapper[4795]: I1205 08:34:42.952382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"5cfc8d950f452da6a1e1434084528e4b3072305c3ebf7fe0ef0d6483a3606312"} Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.231120 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-bwd4b"] Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.232821 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-bwd4b" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.237411 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.237527 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j44df" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.248689 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.262069 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-776ch"] Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.276427 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-776ch" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.285602 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-w69fg" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.285771 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-bwd4b"] Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.296718 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-776ch"] Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.307589 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-97fmt"] Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.308406 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.310928 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-67zsv" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.326116 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-97fmt"] Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.352584 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5snwq\" (UniqueName: \"kubernetes.io/projected/f7d36052-3c5d-4bc4-b8c9-82efe88058d7-kube-api-access-5snwq\") pod \"cert-manager-cainjector-7f985d654d-bwd4b\" (UID: \"f7d36052-3c5d-4bc4-b8c9-82efe88058d7\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-bwd4b" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.453760 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvt46\" (UniqueName: 
\"kubernetes.io/projected/0e5cb2d0-ad47-445f-b16b-e7f05a616aed-kube-api-access-qvt46\") pod \"cert-manager-5b446d88c5-776ch\" (UID: \"0e5cb2d0-ad47-445f-b16b-e7f05a616aed\") " pod="cert-manager/cert-manager-5b446d88c5-776ch" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.453838 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5snwq\" (UniqueName: \"kubernetes.io/projected/f7d36052-3c5d-4bc4-b8c9-82efe88058d7-kube-api-access-5snwq\") pod \"cert-manager-cainjector-7f985d654d-bwd4b\" (UID: \"f7d36052-3c5d-4bc4-b8c9-82efe88058d7\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-bwd4b" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.453877 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjjq\" (UniqueName: \"kubernetes.io/projected/b29439d4-ea88-4ced-ae64-e4926a6d9826-kube-api-access-ppjjq\") pod \"cert-manager-webhook-5655c58dd6-97fmt\" (UID: \"b29439d4-ea88-4ced-ae64-e4926a6d9826\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.485258 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5snwq\" (UniqueName: \"kubernetes.io/projected/f7d36052-3c5d-4bc4-b8c9-82efe88058d7-kube-api-access-5snwq\") pod \"cert-manager-cainjector-7f985d654d-bwd4b\" (UID: \"f7d36052-3c5d-4bc4-b8c9-82efe88058d7\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-bwd4b" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.555370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvt46\" (UniqueName: \"kubernetes.io/projected/0e5cb2d0-ad47-445f-b16b-e7f05a616aed-kube-api-access-qvt46\") pod \"cert-manager-5b446d88c5-776ch\" (UID: \"0e5cb2d0-ad47-445f-b16b-e7f05a616aed\") " pod="cert-manager/cert-manager-5b446d88c5-776ch" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.555440 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjjq\" (UniqueName: \"kubernetes.io/projected/b29439d4-ea88-4ced-ae64-e4926a6d9826-kube-api-access-ppjjq\") pod \"cert-manager-webhook-5655c58dd6-97fmt\" (UID: \"b29439d4-ea88-4ced-ae64-e4926a6d9826\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.575152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvt46\" (UniqueName: \"kubernetes.io/projected/0e5cb2d0-ad47-445f-b16b-e7f05a616aed-kube-api-access-qvt46\") pod \"cert-manager-5b446d88c5-776ch\" (UID: \"0e5cb2d0-ad47-445f-b16b-e7f05a616aed\") " pod="cert-manager/cert-manager-5b446d88c5-776ch" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.578445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjjq\" (UniqueName: \"kubernetes.io/projected/b29439d4-ea88-4ced-ae64-e4926a6d9826-kube-api-access-ppjjq\") pod \"cert-manager-webhook-5655c58dd6-97fmt\" (UID: \"b29439d4-ea88-4ced-ae64-e4926a6d9826\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.581966 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-bwd4b" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.609523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-776ch" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.630054 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.884468 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-bwd4b"] Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.906064 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.967530 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-776ch"] Dec 05 08:35:28 crc kubenswrapper[4795]: W1205 08:35:28.973424 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e5cb2d0_ad47_445f_b16b_e7f05a616aed.slice/crio-409c1dd98662a110baf4c2e4b159d9a8d5423ab14276159c3a3a1adf97b2327f WatchSource:0}: Error finding container 409c1dd98662a110baf4c2e4b159d9a8d5423ab14276159c3a3a1adf97b2327f: Status 404 returned error can't find the container with id 409c1dd98662a110baf4c2e4b159d9a8d5423ab14276159c3a3a1adf97b2327f Dec 05 08:35:28 crc kubenswrapper[4795]: I1205 08:35:28.988742 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-97fmt"] Dec 05 08:35:29 crc kubenswrapper[4795]: I1205 08:35:29.249218 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" event={"ID":"b29439d4-ea88-4ced-ae64-e4926a6d9826","Type":"ContainerStarted","Data":"cb7baf7ea489e9966b684d74910947dfabea733ec234301bd29d5d0c6c407e18"} Dec 05 08:35:29 crc kubenswrapper[4795]: I1205 08:35:29.249966 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-bwd4b" event={"ID":"f7d36052-3c5d-4bc4-b8c9-82efe88058d7","Type":"ContainerStarted","Data":"a0ba2f6aea42e3181cafdd75924965fa3c727d603a2d150c461cf00c6c5c6a4f"} Dec 05 08:35:29 crc 
kubenswrapper[4795]: I1205 08:35:29.250730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-776ch" event={"ID":"0e5cb2d0-ad47-445f-b16b-e7f05a616aed","Type":"ContainerStarted","Data":"409c1dd98662a110baf4c2e4b159d9a8d5423ab14276159c3a3a1adf97b2327f"} Dec 05 08:35:33 crc kubenswrapper[4795]: I1205 08:35:33.279556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-bwd4b" event={"ID":"f7d36052-3c5d-4bc4-b8c9-82efe88058d7","Type":"ContainerStarted","Data":"bda7164e1e6cdc6543e2db0f04992c1b98e6cf44dc3318978d8f867eda9f1d3c"} Dec 05 08:35:33 crc kubenswrapper[4795]: I1205 08:35:33.282700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-776ch" event={"ID":"0e5cb2d0-ad47-445f-b16b-e7f05a616aed","Type":"ContainerStarted","Data":"67a6c32135e1d387ed2ed2a8e23cf81de022df110b295d6e1ba1c7606fafca65"} Dec 05 08:35:33 crc kubenswrapper[4795]: I1205 08:35:33.285154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" event={"ID":"b29439d4-ea88-4ced-ae64-e4926a6d9826","Type":"ContainerStarted","Data":"51685de13623b811b3b52309e911348b2473d7dcf8bda239e7bac6923de50954"} Dec 05 08:35:33 crc kubenswrapper[4795]: I1205 08:35:33.285304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" Dec 05 08:35:33 crc kubenswrapper[4795]: I1205 08:35:33.296201 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-bwd4b" podStartSLOduration=1.403889232 podStartE2EDuration="5.296185014s" podCreationTimestamp="2025-12-05 08:35:28 +0000 UTC" firstStartedPulling="2025-12-05 08:35:28.898039576 +0000 UTC m=+680.470643315" lastFinishedPulling="2025-12-05 08:35:32.790335358 +0000 UTC m=+684.362939097" observedRunningTime="2025-12-05 08:35:33.2936686 +0000 UTC 
m=+684.866272349" watchObservedRunningTime="2025-12-05 08:35:33.296185014 +0000 UTC m=+684.868788753" Dec 05 08:35:33 crc kubenswrapper[4795]: I1205 08:35:33.316182 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-776ch" podStartSLOduration=1.501890546 podStartE2EDuration="5.316164695s" podCreationTimestamp="2025-12-05 08:35:28 +0000 UTC" firstStartedPulling="2025-12-05 08:35:28.976062249 +0000 UTC m=+680.548665988" lastFinishedPulling="2025-12-05 08:35:32.790336398 +0000 UTC m=+684.362940137" observedRunningTime="2025-12-05 08:35:33.31209812 +0000 UTC m=+684.884701859" watchObservedRunningTime="2025-12-05 08:35:33.316164695 +0000 UTC m=+684.888768434" Dec 05 08:35:33 crc kubenswrapper[4795]: I1205 08:35:33.347263 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" podStartSLOduration=1.497577756 podStartE2EDuration="5.347234519s" podCreationTimestamp="2025-12-05 08:35:28 +0000 UTC" firstStartedPulling="2025-12-05 08:35:28.995740722 +0000 UTC m=+680.568344461" lastFinishedPulling="2025-12-05 08:35:32.845397485 +0000 UTC m=+684.418001224" observedRunningTime="2025-12-05 08:35:33.342037566 +0000 UTC m=+684.914641305" watchObservedRunningTime="2025-12-05 08:35:33.347234519 +0000 UTC m=+684.919838268" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.369673 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xl8v5"] Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.370374 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovn-controller" containerID="cri-o://171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9" gracePeriod=30 Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.370477 4795 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="nbdb" containerID="cri-o://ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60" gracePeriod=30 Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.370507 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kube-rbac-proxy-node" containerID="cri-o://1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd" gracePeriod=30 Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.370525 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovn-acl-logging" containerID="cri-o://ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68" gracePeriod=30 Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.370555 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="sbdb" containerID="cri-o://bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863" gracePeriod=30 Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.370804 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="northd" containerID="cri-o://c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d" gracePeriod=30 Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.370830 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0" gracePeriod=30 Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.421566 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" containerID="cri-o://d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace" gracePeriod=30 Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.632882 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-97fmt" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.735302 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/3.log" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.737638 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovn-acl-logging/0.log" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.738252 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovn-controller/0.log" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.738817 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.795225 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7f982"] Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.795872 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="northd" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.795952 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="northd" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796017 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="sbdb" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.796088 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="sbdb" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796145 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kubecfg-setup" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.796197 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kubecfg-setup" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796254 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.796305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796360 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="nbdb" Dec 05 
08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.796439 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="nbdb" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796526 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.796589 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796673 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovn-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.796725 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovn-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796775 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kube-rbac-proxy-node" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.796830 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kube-rbac-proxy-node" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796886 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.796941 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.796998 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 
08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797050 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.797114 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovn-acl-logging" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797181 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovn-acl-logging" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.797235 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797285 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797435 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="nbdb" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797492 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797549 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovn-acl-logging" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797604 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="northd" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797687 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" 
containerName="kube-rbac-proxy-node" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797745 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797820 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797869 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="kube-rbac-proxy-ovn-metrics" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797925 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.797972 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovn-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.798032 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="sbdb" Dec 05 08:35:38 crc kubenswrapper[4795]: E1205 08:35:38.798186 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.798240 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.798411 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerName="ovnkube-controller" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.800175 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.893958 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovn-node-metrics-cert\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894014 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-bin\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894035 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-netns\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-config\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78k86\" (UniqueName: \"kubernetes.io/projected/cfece70d-6476-4442-bcc6-8ee82a8330c1-kube-api-access-78k86\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894128 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-var-lib-openvswitch\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894171 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-kubelet\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-ovn-kubernetes\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-node-log\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-script-lib\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894266 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-log-socket\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: 
\"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894289 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-slash\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894304 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-systemd-units\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894318 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-systemd\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-ovn\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894365 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-etc-openvswitch\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-netd\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894415 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894455 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-openvswitch\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894472 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-env-overrides\") pod \"cfece70d-6476-4442-bcc6-8ee82a8330c1\" (UID: \"cfece70d-6476-4442-bcc6-8ee82a8330c1\") " Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894785 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-log-socket" (OuterVolumeSpecName: "log-socket") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894864 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-slash" (OuterVolumeSpecName: "host-slash") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894883 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.894890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895057 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895564 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895669 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895799 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-node-log" (OuterVolumeSpecName: "node-log") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.895948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.896216 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.901767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.901936 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfece70d-6476-4442-bcc6-8ee82a8330c1-kube-api-access-78k86" (OuterVolumeSpecName: "kube-api-access-78k86") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "kube-api-access-78k86". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.914669 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "cfece70d-6476-4442-bcc6-8ee82a8330c1" (UID: "cfece70d-6476-4442-bcc6-8ee82a8330c1"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996311 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-cni-netd\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-var-lib-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996419 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxx5t\" (UniqueName: \"kubernetes.io/projected/c05132bb-7a08-40c9-8df0-b0f61057dd88-kube-api-access-xxx5t\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996444 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-kubelet\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-systemd-units\") pod \"ovnkube-node-7f982\" (UID: 
\"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-ovn\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996516 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-etc-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovn-node-metrics-cert\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996593 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-env-overrides\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996680 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996703 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-systemd\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996730 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-log-socket\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996768 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovnkube-script-lib\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-node-log\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996854 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996875 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-slash\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovnkube-config\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996921 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-run-netns\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.996943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-cni-bin\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997011 4795 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997029 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997049 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997062 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997075 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997086 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997099 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78k86\" 
(UniqueName: \"kubernetes.io/projected/cfece70d-6476-4442-bcc6-8ee82a8330c1-kube-api-access-78k86\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997115 4795 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997127 4795 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997139 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997150 4795 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-node-log\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997162 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cfece70d-6476-4442-bcc6-8ee82a8330c1-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997175 4795 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-log-socket\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997187 4795 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-systemd\") on node \"crc\" 
DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997199 4795 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997209 4795 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-slash\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997221 4795 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997233 4795 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997245 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:38 crc kubenswrapper[4795]: I1205 08:35:38.997257 4795 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cfece70d-6476-4442-bcc6-8ee82a8330c1-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.097885 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-node-log\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.097928 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.097947 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-slash\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.097965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovnkube-config\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.097983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-run-netns\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098001 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-cni-bin\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc 
kubenswrapper[4795]: I1205 08:35:39.098005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-node-log\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-cni-netd\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-cni-netd\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-var-lib-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098103 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098123 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xxx5t\" (UniqueName: \"kubernetes.io/projected/c05132bb-7a08-40c9-8df0-b0f61057dd88-kube-api-access-xxx5t\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098170 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-cni-bin\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098173 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-kubelet\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098143 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-slash\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098224 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-systemd-units\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098227 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-var-lib-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-systemd-units\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-ovn\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098268 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-ovn\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-kubelet\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-run-netns\") pod \"ovnkube-node-7f982\" (UID: 
\"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-etc-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovn-node-metrics-cert\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-env-overrides\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f982\" 
(UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-systemd\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098507 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098529 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-log-socket\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovnkube-script-lib\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098570 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-host-run-ovn-kubernetes\") pod \"ovnkube-node-7f982\" (UID: 
\"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-run-systemd\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-log-socket\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098731 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovnkube-config\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.098765 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c05132bb-7a08-40c9-8df0-b0f61057dd88-etc-openvswitch\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.099449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovnkube-script-lib\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc 
kubenswrapper[4795]: I1205 08:35:39.100060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c05132bb-7a08-40c9-8df0-b0f61057dd88-env-overrides\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.102072 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c05132bb-7a08-40c9-8df0-b0f61057dd88-ovn-node-metrics-cert\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.117888 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxx5t\" (UniqueName: \"kubernetes.io/projected/c05132bb-7a08-40c9-8df0-b0f61057dd88-kube-api-access-xxx5t\") pod \"ovnkube-node-7f982\" (UID: \"c05132bb-7a08-40c9-8df0-b0f61057dd88\") " pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.324126 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/2.log" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.324961 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/1.log" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.324996 4795 generic.go:334] "Generic (PLEG): container finished" podID="9dd42ab7-1f98-4f50-ae12-15ec6587bc4e" containerID="863f2c68098a900a1d311fea196bad6095f72bce77173f4e437e7e983cb49f39" exitCode=2 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.325061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhxnf" 
event={"ID":"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e","Type":"ContainerDied","Data":"863f2c68098a900a1d311fea196bad6095f72bce77173f4e437e7e983cb49f39"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.325110 4795 scope.go:117] "RemoveContainer" containerID="927f2ae836acd6dc1a21ec1674c3bcda16fb034ef9c23c82d951821a14e3ca46" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.325719 4795 scope.go:117] "RemoveContainer" containerID="863f2c68098a900a1d311fea196bad6095f72bce77173f4e437e7e983cb49f39" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.326084 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bhxnf_openshift-multus(9dd42ab7-1f98-4f50-ae12-15ec6587bc4e)\"" pod="openshift-multus/multus-bhxnf" podUID="9dd42ab7-1f98-4f50-ae12-15ec6587bc4e" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.334185 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovnkube-controller/3.log" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.338090 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovn-acl-logging/0.log" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.339404 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xl8v5_cfece70d-6476-4442-bcc6-8ee82a8330c1/ovn-controller/0.log" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340250 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace" exitCode=0 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340319 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863" exitCode=0 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340334 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60" exitCode=0 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340346 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d" exitCode=0 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340355 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0" exitCode=0 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340365 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd" exitCode=0 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340403 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68" exitCode=143 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340427 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfece70d-6476-4442-bcc6-8ee82a8330c1" containerID="171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9" exitCode=143 Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} Dec 05 
08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340655 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340733 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340777 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340820 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340832 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340842 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340852 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340860 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340892 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340903 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340910 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340917 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340945 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340985 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.340997 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341005 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341012 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341020 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341028 4795 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341061 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341069 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341077 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341105 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341175 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341188 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} Dec 05 
08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341196 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341209 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341217 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341257 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341268 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341276 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341283 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" 
event={"ID":"cfece70d-6476-4442-bcc6-8ee82a8330c1","Type":"ContainerDied","Data":"630d1bf9bd3891171ba8e1bb2893eb306f3c430e6b7c35654215ce270ee7288b"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341342 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341354 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341362 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341370 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341378 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341386 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341428 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341447 4795 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341456 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341464 4795 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.341917 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xl8v5" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.405712 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xl8v5"] Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.408311 4795 scope.go:117] "RemoveContainer" containerID="d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.410578 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xl8v5"] Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.415258 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.436822 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.474542 4795 scope.go:117] "RemoveContainer" containerID="bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.548152 4795 scope.go:117] "RemoveContainer" containerID="ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.564073 4795 scope.go:117] "RemoveContainer" containerID="c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.579414 4795 scope.go:117] "RemoveContainer" containerID="b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.598353 4795 scope.go:117] "RemoveContainer" containerID="1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.661984 4795 scope.go:117] "RemoveContainer" containerID="ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.680237 4795 scope.go:117] "RemoveContainer" containerID="171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.696602 4795 scope.go:117] "RemoveContainer" containerID="b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.713694 4795 scope.go:117] "RemoveContainer" containerID="d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.714253 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": container with ID starting with d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace not found: ID does not exist" containerID="d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.714317 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} err="failed to get container status \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": rpc error: code = NotFound desc = could not find container \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": container with ID starting with d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.714354 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.714941 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": container with ID starting with 5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895 not found: ID does not exist" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.714994 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} err="failed to get container status \"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": rpc error: code = NotFound desc = could not find container 
\"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": container with ID starting with 5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.715033 4795 scope.go:117] "RemoveContainer" containerID="bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.715487 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": container with ID starting with bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863 not found: ID does not exist" containerID="bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.715523 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} err="failed to get container status \"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": rpc error: code = NotFound desc = could not find container \"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": container with ID starting with bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.715549 4795 scope.go:117] "RemoveContainer" containerID="ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.717357 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": container with ID starting with ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60 not found: ID does not exist" 
containerID="ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.717392 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} err="failed to get container status \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": rpc error: code = NotFound desc = could not find container \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": container with ID starting with ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.717414 4795 scope.go:117] "RemoveContainer" containerID="c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.717936 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": container with ID starting with c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d not found: ID does not exist" containerID="c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.717971 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} err="failed to get container status \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": rpc error: code = NotFound desc = could not find container \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": container with ID starting with c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.717991 4795 scope.go:117] 
"RemoveContainer" containerID="b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.718359 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": container with ID starting with b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0 not found: ID does not exist" containerID="b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.718392 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} err="failed to get container status \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": rpc error: code = NotFound desc = could not find container \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": container with ID starting with b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.718411 4795 scope.go:117] "RemoveContainer" containerID="1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.718722 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": container with ID starting with 1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd not found: ID does not exist" containerID="1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.718758 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} err="failed to get container status \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": rpc error: code = NotFound desc = could not find container \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": container with ID starting with 1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.718784 4795 scope.go:117] "RemoveContainer" containerID="ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.719116 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": container with ID starting with ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68 not found: ID does not exist" containerID="ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.719148 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} err="failed to get container status \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": rpc error: code = NotFound desc = could not find container \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": container with ID starting with ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.719174 4795 scope.go:117] "RemoveContainer" containerID="171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.719535 4795 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": container with ID starting with 171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9 not found: ID does not exist" containerID="171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.719601 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} err="failed to get container status \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": rpc error: code = NotFound desc = could not find container \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": container with ID starting with 171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.719657 4795 scope.go:117] "RemoveContainer" containerID="b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089" Dec 05 08:35:39 crc kubenswrapper[4795]: E1205 08:35:39.720101 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": container with ID starting with b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089 not found: ID does not exist" containerID="b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.720136 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} err="failed to get container status \"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": rpc error: code = NotFound desc = could not find container 
\"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": container with ID starting with b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.720158 4795 scope.go:117] "RemoveContainer" containerID="d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.720440 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} err="failed to get container status \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": rpc error: code = NotFound desc = could not find container \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": container with ID starting with d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.720479 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.720815 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} err="failed to get container status \"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": rpc error: code = NotFound desc = could not find container \"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": container with ID starting with 5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.720846 4795 scope.go:117] "RemoveContainer" containerID="bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.721397 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} err="failed to get container status \"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": rpc error: code = NotFound desc = could not find container \"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": container with ID starting with bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.721441 4795 scope.go:117] "RemoveContainer" containerID="ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.721726 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} err="failed to get container status \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": rpc error: code = NotFound desc = could not find container \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": container with ID starting with ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.721757 4795 scope.go:117] "RemoveContainer" containerID="c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.722949 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} err="failed to get container status \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": rpc error: code = NotFound desc = could not find container \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": container with ID starting with 
c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.723036 4795 scope.go:117] "RemoveContainer" containerID="b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.723510 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} err="failed to get container status \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": rpc error: code = NotFound desc = could not find container \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": container with ID starting with b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.723534 4795 scope.go:117] "RemoveContainer" containerID="1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.723918 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} err="failed to get container status \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": rpc error: code = NotFound desc = could not find container \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": container with ID starting with 1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.723989 4795 scope.go:117] "RemoveContainer" containerID="ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.724417 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} err="failed to get container status \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": rpc error: code = NotFound desc = could not find container \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": container with ID starting with ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.724444 4795 scope.go:117] "RemoveContainer" containerID="171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.724852 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} err="failed to get container status \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": rpc error: code = NotFound desc = could not find container \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": container with ID starting with 171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.724884 4795 scope.go:117] "RemoveContainer" containerID="b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.725166 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} err="failed to get container status \"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": rpc error: code = NotFound desc = could not find container \"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": container with ID starting with b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089 not found: ID does not 
exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.725213 4795 scope.go:117] "RemoveContainer" containerID="d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.726011 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} err="failed to get container status \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": rpc error: code = NotFound desc = could not find container \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": container with ID starting with d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.726049 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.726711 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} err="failed to get container status \"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": rpc error: code = NotFound desc = could not find container \"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": container with ID starting with 5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.726755 4795 scope.go:117] "RemoveContainer" containerID="bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.727094 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} err="failed to get container status 
\"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": rpc error: code = NotFound desc = could not find container \"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": container with ID starting with bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.727128 4795 scope.go:117] "RemoveContainer" containerID="ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.727491 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} err="failed to get container status \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": rpc error: code = NotFound desc = could not find container \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": container with ID starting with ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.727537 4795 scope.go:117] "RemoveContainer" containerID="c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.727891 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} err="failed to get container status \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": rpc error: code = NotFound desc = could not find container \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": container with ID starting with c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.727957 4795 scope.go:117] "RemoveContainer" 
containerID="b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.728258 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} err="failed to get container status \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": rpc error: code = NotFound desc = could not find container \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": container with ID starting with b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.728302 4795 scope.go:117] "RemoveContainer" containerID="1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.728972 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} err="failed to get container status \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": rpc error: code = NotFound desc = could not find container \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": container with ID starting with 1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.729004 4795 scope.go:117] "RemoveContainer" containerID="ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.729285 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} err="failed to get container status \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": rpc error: code = NotFound desc = could 
not find container \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": container with ID starting with ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.729305 4795 scope.go:117] "RemoveContainer" containerID="171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.729691 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} err="failed to get container status \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": rpc error: code = NotFound desc = could not find container \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": container with ID starting with 171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.729736 4795 scope.go:117] "RemoveContainer" containerID="b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.730113 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} err="failed to get container status \"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": rpc error: code = NotFound desc = could not find container \"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": container with ID starting with b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.730190 4795 scope.go:117] "RemoveContainer" containerID="d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 
08:35:39.731411 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace"} err="failed to get container status \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": rpc error: code = NotFound desc = could not find container \"d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace\": container with ID starting with d3af3fe4c21c42d9bd37ccb076ccbbcd64e3131f4ea72480daa7393fff4b1ace not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.731451 4795 scope.go:117] "RemoveContainer" containerID="5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.731978 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895"} err="failed to get container status \"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": rpc error: code = NotFound desc = could not find container \"5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895\": container with ID starting with 5c6302c9a101e264c9e44bb8c0d82dd0b94a38e2da0c9909918d1be58f043895 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.732078 4795 scope.go:117] "RemoveContainer" containerID="bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.732511 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863"} err="failed to get container status \"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": rpc error: code = NotFound desc = could not find container \"bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863\": container with ID starting with 
bdf8fcc9ac9fac18b27bd9c5fe3093f43449f781b416ad98d11b70a6667b0863 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.732586 4795 scope.go:117] "RemoveContainer" containerID="ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.732943 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60"} err="failed to get container status \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": rpc error: code = NotFound desc = could not find container \"ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60\": container with ID starting with ff349cffbdf505e7740585ebc29cf5ce3689361a1e952dd73de2514bfb72be60 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.732996 4795 scope.go:117] "RemoveContainer" containerID="c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.733438 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d"} err="failed to get container status \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": rpc error: code = NotFound desc = could not find container \"c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d\": container with ID starting with c8b2ac974b11aac99cb23a0c4aff2142e1b2ba8bede3f5361e7a3769ea8ca22d not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.733460 4795 scope.go:117] "RemoveContainer" containerID="b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.733761 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0"} err="failed to get container status \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": rpc error: code = NotFound desc = could not find container \"b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0\": container with ID starting with b289f95f0b5054cc5d3347d082252a3a54263da610d09bbb5a5c63820a8999a0 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.733792 4795 scope.go:117] "RemoveContainer" containerID="1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.735129 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd"} err="failed to get container status \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": rpc error: code = NotFound desc = could not find container \"1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd\": container with ID starting with 1f223bf3c5298b70cd9840d374f0b244402f358bb5c46712939111eead706cbd not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.735152 4795 scope.go:117] "RemoveContainer" containerID="ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.735487 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68"} err="failed to get container status \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": rpc error: code = NotFound desc = could not find container \"ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68\": container with ID starting with ec101434d7c4922686aff2787ecdbdb7ea7cb817b16d25ad40809b3bd51f0d68 not found: ID does not 
exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.735507 4795 scope.go:117] "RemoveContainer" containerID="171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.735823 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9"} err="failed to get container status \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": rpc error: code = NotFound desc = could not find container \"171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9\": container with ID starting with 171663cab2b3e0fc037eb269cf182b2463e307e66304d626aca41d24ccd162b9 not found: ID does not exist" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.735843 4795 scope.go:117] "RemoveContainer" containerID="b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089" Dec 05 08:35:39 crc kubenswrapper[4795]: I1205 08:35:39.736199 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089"} err="failed to get container status \"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": rpc error: code = NotFound desc = could not find container \"b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089\": container with ID starting with b0639c9f0beb6c5db3147096ca8a33de5749f5b9202cad2cfd0f48b5612f0089 not found: ID does not exist" Dec 05 08:35:40 crc kubenswrapper[4795]: I1205 08:35:40.350774 4795 generic.go:334] "Generic (PLEG): container finished" podID="c05132bb-7a08-40c9-8df0-b0f61057dd88" containerID="153fbc446337173f7701d3d357fba74315a3cb63a068e90966c1747c1a79f258" exitCode=0 Dec 05 08:35:40 crc kubenswrapper[4795]: I1205 08:35:40.350864 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" 
event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerDied","Data":"153fbc446337173f7701d3d357fba74315a3cb63a068e90966c1747c1a79f258"} Dec 05 08:35:40 crc kubenswrapper[4795]: I1205 08:35:40.350941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"da7d6b32fe2082126750efcdb92a1fdbb4a8f7a41b1705cc2005338cf8221807"} Dec 05 08:35:40 crc kubenswrapper[4795]: I1205 08:35:40.352994 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/2.log" Dec 05 08:35:40 crc kubenswrapper[4795]: I1205 08:35:40.754058 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfece70d-6476-4442-bcc6-8ee82a8330c1" path="/var/lib/kubelet/pods/cfece70d-6476-4442-bcc6-8ee82a8330c1/volumes" Dec 05 08:35:41 crc kubenswrapper[4795]: I1205 08:35:41.363019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"318c86a58dbae9921b4ca11b84a70166382d78ffa01c4b633030f9c091a81062"} Dec 05 08:35:41 crc kubenswrapper[4795]: I1205 08:35:41.363517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"27750673ee8eb58292e0f66606493115163f77322e44fbb79c5f7ebd9768922c"} Dec 05 08:35:41 crc kubenswrapper[4795]: I1205 08:35:41.363534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"f7b3fcfef5323e84517faf81ec2dac9c1788d52ea99e9b26af41d045ac3fc5da"} Dec 05 08:35:42 crc kubenswrapper[4795]: I1205 08:35:42.373153 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"2d50117ce752d663ba7c7760d0e53cec9334f6197d11669046b429a1585affaa"} Dec 05 08:35:42 crc kubenswrapper[4795]: I1205 08:35:42.373538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"f6c6c68901e36654c1db7ff6590e61bba1fe61fc3416972039c9d7da4e4f637e"} Dec 05 08:35:42 crc kubenswrapper[4795]: I1205 08:35:42.373553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"cfe1cb9142e680ae714d6a7d3ff5b290468cae8e7a770ea646eece392e6b1cd4"} Dec 05 08:35:44 crc kubenswrapper[4795]: I1205 08:35:44.403545 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"451a5c9e6e79fb6b972395037ab9549609e04c61e2bdcdfc54b307ccf3d7f89b"} Dec 05 08:35:46 crc kubenswrapper[4795]: I1205 08:35:46.426881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" event={"ID":"c05132bb-7a08-40c9-8df0-b0f61057dd88","Type":"ContainerStarted","Data":"108acfc2c09d62373fbe7e3d4078731e004080076795c2e1553c5f024ff18aa0"} Dec 05 08:35:46 crc kubenswrapper[4795]: I1205 08:35:46.428412 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:46 crc kubenswrapper[4795]: I1205 08:35:46.428446 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:46 crc kubenswrapper[4795]: I1205 08:35:46.428490 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:46 crc kubenswrapper[4795]: I1205 08:35:46.474742 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" podStartSLOduration=8.474711615 podStartE2EDuration="8.474711615s" podCreationTimestamp="2025-12-05 08:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:35:46.473437281 +0000 UTC m=+698.046041020" watchObservedRunningTime="2025-12-05 08:35:46.474711615 +0000 UTC m=+698.047315354" Dec 05 08:35:46 crc kubenswrapper[4795]: I1205 08:35:46.494362 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:46 crc kubenswrapper[4795]: I1205 08:35:46.495048 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:35:50 crc kubenswrapper[4795]: I1205 08:35:50.747369 4795 scope.go:117] "RemoveContainer" containerID="863f2c68098a900a1d311fea196bad6095f72bce77173f4e437e7e983cb49f39" Dec 05 08:35:50 crc kubenswrapper[4795]: E1205 08:35:50.748289 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bhxnf_openshift-multus(9dd42ab7-1f98-4f50-ae12-15ec6587bc4e)\"" pod="openshift-multus/multus-bhxnf" podUID="9dd42ab7-1f98-4f50-ae12-15ec6587bc4e" Dec 05 08:36:05 crc kubenswrapper[4795]: I1205 08:36:05.747352 4795 scope.go:117] "RemoveContainer" containerID="863f2c68098a900a1d311fea196bad6095f72bce77173f4e437e7e983cb49f39" Dec 05 08:36:06 crc kubenswrapper[4795]: I1205 08:36:06.554417 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhxnf_9dd42ab7-1f98-4f50-ae12-15ec6587bc4e/kube-multus/2.log" Dec 05 08:36:06 crc 
kubenswrapper[4795]: I1205 08:36:06.554910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhxnf" event={"ID":"9dd42ab7-1f98-4f50-ae12-15ec6587bc4e","Type":"ContainerStarted","Data":"1c791784668534e36f382f385b3963445661f95bd502ffc790955d0b14f44408"} Dec 05 08:36:09 crc kubenswrapper[4795]: I1205 08:36:09.449813 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7f982" Dec 05 08:36:21 crc kubenswrapper[4795]: I1205 08:36:21.890590 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w"] Dec 05 08:36:21 crc kubenswrapper[4795]: I1205 08:36:21.892455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:21 crc kubenswrapper[4795]: I1205 08:36:21.897011 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 08:36:21 crc kubenswrapper[4795]: I1205 08:36:21.907080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w"] Dec 05 08:36:21 crc kubenswrapper[4795]: I1205 08:36:21.948784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsn69\" (UniqueName: \"kubernetes.io/projected/ff41acb4-73bc-4c91-9556-ef40bd698fd1-kube-api-access-hsn69\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:21 crc kubenswrapper[4795]: I1205 08:36:21.948836 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:21 crc kubenswrapper[4795]: I1205 08:36:21.948912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.049604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.049691 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.049748 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsn69\" (UniqueName: \"kubernetes.io/projected/ff41acb4-73bc-4c91-9556-ef40bd698fd1-kube-api-access-hsn69\") pod 
\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.050267 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.050330 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.080452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsn69\" (UniqueName: \"kubernetes.io/projected/ff41acb4-73bc-4c91-9556-ef40bd698fd1-kube-api-access-hsn69\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.219561 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.435296 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w"] Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.755032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" event={"ID":"ff41acb4-73bc-4c91-9556-ef40bd698fd1","Type":"ContainerStarted","Data":"94f665800e9458305095bb99ce158d46ae5e4934a114e666c0e8f5162b1dbea2"} Dec 05 08:36:22 crc kubenswrapper[4795]: I1205 08:36:22.755096 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" event={"ID":"ff41acb4-73bc-4c91-9556-ef40bd698fd1","Type":"ContainerStarted","Data":"46b37ba98aff1c8944e12ee974944c4d79716d21fe51bc50e486a0e5cf8709e7"} Dec 05 08:36:23 crc kubenswrapper[4795]: I1205 08:36:23.958916 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerID="94f665800e9458305095bb99ce158d46ae5e4934a114e666c0e8f5162b1dbea2" exitCode=0 Dec 05 08:36:23 crc kubenswrapper[4795]: I1205 08:36:23.958990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" event={"ID":"ff41acb4-73bc-4c91-9556-ef40bd698fd1","Type":"ContainerDied","Data":"94f665800e9458305095bb99ce158d46ae5e4934a114e666c0e8f5162b1dbea2"} Dec 05 08:36:27 crc kubenswrapper[4795]: I1205 08:36:27.986602 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerID="2879e24c9a5797f6a5c17711fb18ef4595a6564413ba0e57eed30ec79d6492e2" exitCode=0 Dec 05 08:36:27 crc kubenswrapper[4795]: I1205 08:36:27.987069 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" event={"ID":"ff41acb4-73bc-4c91-9556-ef40bd698fd1","Type":"ContainerDied","Data":"2879e24c9a5797f6a5c17711fb18ef4595a6564413ba0e57eed30ec79d6492e2"} Dec 05 08:36:28 crc kubenswrapper[4795]: I1205 08:36:28.994754 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerID="a632f7160c15d079ed13605a14fcd45c86f902c4dccbc01446a7fd5346cea614" exitCode=0 Dec 05 08:36:28 crc kubenswrapper[4795]: I1205 08:36:28.994858 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" event={"ID":"ff41acb4-73bc-4c91-9556-ef40bd698fd1","Type":"ContainerDied","Data":"a632f7160c15d079ed13605a14fcd45c86f902c4dccbc01446a7fd5346cea614"} Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.282257 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.364669 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-bundle\") pod \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.364746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsn69\" (UniqueName: \"kubernetes.io/projected/ff41acb4-73bc-4c91-9556-ef40bd698fd1-kube-api-access-hsn69\") pod \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.364836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-util\") pod \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\" (UID: \"ff41acb4-73bc-4c91-9556-ef40bd698fd1\") " Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.365653 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-bundle" (OuterVolumeSpecName: "bundle") pod "ff41acb4-73bc-4c91-9556-ef40bd698fd1" (UID: "ff41acb4-73bc-4c91-9556-ef40bd698fd1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.371592 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff41acb4-73bc-4c91-9556-ef40bd698fd1-kube-api-access-hsn69" (OuterVolumeSpecName: "kube-api-access-hsn69") pod "ff41acb4-73bc-4c91-9556-ef40bd698fd1" (UID: "ff41acb4-73bc-4c91-9556-ef40bd698fd1"). InnerVolumeSpecName "kube-api-access-hsn69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.375638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-util" (OuterVolumeSpecName: "util") pod "ff41acb4-73bc-4c91-9556-ef40bd698fd1" (UID: "ff41acb4-73bc-4c91-9556-ef40bd698fd1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.466192 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-util\") on node \"crc\" DevicePath \"\"" Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.466229 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff41acb4-73bc-4c91-9556-ef40bd698fd1-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:36:30 crc kubenswrapper[4795]: I1205 08:36:30.466240 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsn69\" (UniqueName: \"kubernetes.io/projected/ff41acb4-73bc-4c91-9556-ef40bd698fd1-kube-api-access-hsn69\") on node \"crc\" DevicePath \"\"" Dec 05 08:36:31 crc kubenswrapper[4795]: I1205 08:36:31.013999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" event={"ID":"ff41acb4-73bc-4c91-9556-ef40bd698fd1","Type":"ContainerDied","Data":"46b37ba98aff1c8944e12ee974944c4d79716d21fe51bc50e486a0e5cf8709e7"} Dec 05 08:36:31 crc kubenswrapper[4795]: I1205 08:36:31.014082 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b37ba98aff1c8944e12ee974944c4d79716d21fe51bc50e486a0e5cf8709e7" Dec 05 08:36:31 crc kubenswrapper[4795]: I1205 08:36:31.014131 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.526607 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz"] Dec 05 08:36:33 crc kubenswrapper[4795]: E1205 08:36:33.527196 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerName="extract" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.527209 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerName="extract" Dec 05 08:36:33 crc kubenswrapper[4795]: E1205 08:36:33.527227 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerName="pull" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.527239 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerName="pull" Dec 05 08:36:33 crc kubenswrapper[4795]: E1205 08:36:33.527251 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerName="util" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.527260 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerName="util" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.527373 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff41acb4-73bc-4c91-9556-ef40bd698fd1" containerName="extract" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.527834 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.530858 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.531086 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rjsrk" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.531260 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.545778 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz"] Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.610573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sks59\" (UniqueName: \"kubernetes.io/projected/d640de60-a6a0-4c76-8fa4-4370edd7363b-kube-api-access-sks59\") pod \"nmstate-operator-5b5b58f5c8-znzrz\" (UID: \"d640de60-a6a0-4c76-8fa4-4370edd7363b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.712440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sks59\" (UniqueName: \"kubernetes.io/projected/d640de60-a6a0-4c76-8fa4-4370edd7363b-kube-api-access-sks59\") pod \"nmstate-operator-5b5b58f5c8-znzrz\" (UID: \"d640de60-a6a0-4c76-8fa4-4370edd7363b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.737915 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sks59\" (UniqueName: \"kubernetes.io/projected/d640de60-a6a0-4c76-8fa4-4370edd7363b-kube-api-access-sks59\") pod \"nmstate-operator-5b5b58f5c8-znzrz\" (UID: 
\"d640de60-a6a0-4c76-8fa4-4370edd7363b\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz" Dec 05 08:36:33 crc kubenswrapper[4795]: I1205 08:36:33.866474 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz" Dec 05 08:36:34 crc kubenswrapper[4795]: I1205 08:36:34.247293 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz"] Dec 05 08:36:35 crc kubenswrapper[4795]: I1205 08:36:35.062452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz" event={"ID":"d640de60-a6a0-4c76-8fa4-4370edd7363b","Type":"ContainerStarted","Data":"8fedd7feb4478f4f6f96ef2381035e9f70800a59576a66fa7c332fc86eb3ef0a"} Dec 05 08:36:38 crc kubenswrapper[4795]: I1205 08:36:38.082931 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz" event={"ID":"d640de60-a6a0-4c76-8fa4-4370edd7363b","Type":"ContainerStarted","Data":"4109d11f46a4a5c987467238f73b78586c8a2077d7639d42c59a03dc432a5d80"} Dec 05 08:36:38 crc kubenswrapper[4795]: I1205 08:36:38.109101 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-znzrz" podStartSLOduration=2.394677764 podStartE2EDuration="5.109075284s" podCreationTimestamp="2025-12-05 08:36:33 +0000 UTC" firstStartedPulling="2025-12-05 08:36:34.267764677 +0000 UTC m=+745.840368406" lastFinishedPulling="2025-12-05 08:36:36.982162187 +0000 UTC m=+748.554765926" observedRunningTime="2025-12-05 08:36:38.102973875 +0000 UTC m=+749.675577614" watchObservedRunningTime="2025-12-05 08:36:38.109075284 +0000 UTC m=+749.681679023" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.203781 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 
08:36:42.205458 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.210629 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-s4tpf" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.216482 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.252435 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.253181 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.254182 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x268p\" (UniqueName: \"kubernetes.io/projected/347909b2-eaf4-4b55-becf-cde638716053-kube-api-access-x268p\") pod \"nmstate-metrics-7f946cbc9-4zm26\" (UID: \"347909b2-eaf4-4b55-becf-cde638716053\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.262239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.277260 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.313471 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fhm2n"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.314383 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.355036 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-nmstate-lock\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.355372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x268p\" (UniqueName: \"kubernetes.io/projected/347909b2-eaf4-4b55-becf-cde638716053-kube-api-access-x268p\") pod \"nmstate-metrics-7f946cbc9-4zm26\" (UID: \"347909b2-eaf4-4b55-becf-cde638716053\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.355473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfqlj\" (UniqueName: \"kubernetes.io/projected/8b96bb54-83d3-4c36-a347-bad33a85e746-kube-api-access-zfqlj\") pod \"nmstate-webhook-5f6d4c5ccb-6bqfk\" (UID: \"8b96bb54-83d3-4c36-a347-bad33a85e746\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.355552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-ovs-socket\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.355677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-dbus-socket\") pod 
\"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.355799 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8b96bb54-83d3-4c36-a347-bad33a85e746-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-6bqfk\" (UID: \"8b96bb54-83d3-4c36-a347-bad33a85e746\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.355842 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5shn\" (UniqueName: \"kubernetes.io/projected/8513d59a-88fb-4414-bfa8-e5fcfc599cca-kube-api-access-j5shn\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.379768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x268p\" (UniqueName: \"kubernetes.io/projected/347909b2-eaf4-4b55-becf-cde638716053-kube-api-access-x268p\") pod \"nmstate-metrics-7f946cbc9-4zm26\" (UID: \"347909b2-eaf4-4b55-becf-cde638716053\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.441943 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.442731 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.444813 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.446120 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.452404 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8g4vn" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457600 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457724 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-nmstate-lock\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfqlj\" (UniqueName: \"kubernetes.io/projected/8b96bb54-83d3-4c36-a347-bad33a85e746-kube-api-access-zfqlj\") pod \"nmstate-webhook-5f6d4c5ccb-6bqfk\" (UID: \"8b96bb54-83d3-4c36-a347-bad33a85e746\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" 
(UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-ovs-socket\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-dbus-socket\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457814 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8b96bb54-83d3-4c36-a347-bad33a85e746-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-6bqfk\" (UID: \"8b96bb54-83d3-4c36-a347-bad33a85e746\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5shn\" (UniqueName: \"kubernetes.io/projected/8513d59a-88fb-4414-bfa8-e5fcfc599cca-kube-api-access-j5shn\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457866 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-nmstate-lock\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457935 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-ovs-socket\") 
pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: E1205 08:36:42.457990 4795 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 05 08:36:42 crc kubenswrapper[4795]: E1205 08:36:42.458066 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b96bb54-83d3-4c36-a347-bad33a85e746-tls-key-pair podName:8b96bb54-83d3-4c36-a347-bad33a85e746 nodeName:}" failed. No retries permitted until 2025-12-05 08:36:42.958042378 +0000 UTC m=+754.530646107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/8b96bb54-83d3-4c36-a347-bad33a85e746-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-6bqfk" (UID: "8b96bb54-83d3-4c36-a347-bad33a85e746") : secret "openshift-nmstate-webhook" not found Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.458230 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8513d59a-88fb-4414-bfa8-e5fcfc599cca-dbus-socket\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.457857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.458341 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbn89\" (UniqueName: 
\"kubernetes.io/projected/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-kube-api-access-tbn89\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.486494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfqlj\" (UniqueName: \"kubernetes.io/projected/8b96bb54-83d3-4c36-a347-bad33a85e746-kube-api-access-zfqlj\") pod \"nmstate-webhook-5f6d4c5ccb-6bqfk\" (UID: \"8b96bb54-83d3-4c36-a347-bad33a85e746\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.507248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5shn\" (UniqueName: \"kubernetes.io/projected/8513d59a-88fb-4414-bfa8-e5fcfc599cca-kube-api-access-j5shn\") pod \"nmstate-handler-fhm2n\" (UID: \"8513d59a-88fb-4414-bfa8-e5fcfc599cca\") " pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.507420 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.524817 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.559129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.559259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.559293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbn89\" (UniqueName: \"kubernetes.io/projected/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-kube-api-access-tbn89\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: E1205 08:36:42.559881 4795 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 05 08:36:42 crc kubenswrapper[4795]: E1205 08:36:42.560035 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-plugin-serving-cert podName:e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6 nodeName:}" failed. No retries permitted until 2025-12-05 08:36:43.059997662 +0000 UTC m=+754.632601401 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-c5s7r" (UID: "e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6") : secret "plugin-serving-cert" not found Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.561360 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.590234 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbn89\" (UniqueName: \"kubernetes.io/projected/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-kube-api-access-tbn89\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.631758 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.743126 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-57cf76c794-fv9fk"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.744017 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.762287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-serving-cert\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.762473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-service-ca\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.762662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-config\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.762773 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-oauth-serving-cert\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.762942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-oauth-config\") 
pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.763029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-trusted-ca-bundle\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.763106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6xbc\" (UniqueName: \"kubernetes.io/projected/d5b15de2-618a-4aad-886e-7ba7ba43307e-kube-api-access-z6xbc\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.897378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-oauth-serving-cert\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.898234 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-oauth-config\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.898274 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-trusted-ca-bundle\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.898305 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6xbc\" (UniqueName: \"kubernetes.io/projected/d5b15de2-618a-4aad-886e-7ba7ba43307e-kube-api-access-z6xbc\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.898374 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-serving-cert\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.898410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-service-ca\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.898439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-config\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.898859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-oauth-serving-cert\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.899365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-config\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.915122 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57cf76c794-fv9fk"] Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.916262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-service-ca\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.917100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-serving-cert\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.919993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5b15de2-618a-4aad-886e-7ba7ba43307e-console-oauth-config\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.975376 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6xbc\" (UniqueName: \"kubernetes.io/projected/d5b15de2-618a-4aad-886e-7ba7ba43307e-kube-api-access-z6xbc\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:42 crc kubenswrapper[4795]: I1205 08:36:42.985384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5b15de2-618a-4aad-886e-7ba7ba43307e-trusted-ca-bundle\") pod \"console-57cf76c794-fv9fk\" (UID: \"d5b15de2-618a-4aad-886e-7ba7ba43307e\") " pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:42.999965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8b96bb54-83d3-4c36-a347-bad33a85e746-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-6bqfk\" (UID: \"8b96bb54-83d3-4c36-a347-bad33a85e746\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.013464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8b96bb54-83d3-4c36-a347-bad33a85e746-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-6bqfk\" (UID: \"8b96bb54-83d3-4c36-a347-bad33a85e746\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.066836 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.101559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.106051 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-c5s7r\" (UID: \"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.170971 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.213859 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fhm2n" event={"ID":"8513d59a-88fb-4414-bfa8-e5fcfc599cca","Type":"ContainerStarted","Data":"2882a825b8ce56cbe4daeb49db5b7e4f0e1f31bb48e8c302d7041c0372686423"} Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.234570 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26"] Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.386166 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.494110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57cf76c794-fv9fk"] Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.623283 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk"] Dec 05 08:36:43 crc kubenswrapper[4795]: I1205 08:36:43.718443 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r"] Dec 05 08:36:44 crc kubenswrapper[4795]: I1205 08:36:44.222455 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cf76c794-fv9fk" event={"ID":"d5b15de2-618a-4aad-886e-7ba7ba43307e","Type":"ContainerStarted","Data":"6d02a058f8ef4998613d527a28234ffa2a8bb975cee52bbd7269d3b3d0e945d7"} Dec 05 08:36:44 crc kubenswrapper[4795]: I1205 08:36:44.222526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cf76c794-fv9fk" event={"ID":"d5b15de2-618a-4aad-886e-7ba7ba43307e","Type":"ContainerStarted","Data":"91b55b37e72626b2c37e0d945fd321b5af48081278c2520510fbfb80576b3ba1"} Dec 05 08:36:44 crc kubenswrapper[4795]: I1205 08:36:44.227509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" event={"ID":"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6","Type":"ContainerStarted","Data":"136aa8180ac38b2dbfd23c2a52429b5515c6a55e48da8f85f3272bf44d4e87ea"} Dec 05 08:36:44 crc kubenswrapper[4795]: I1205 08:36:44.229072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" event={"ID":"8b96bb54-83d3-4c36-a347-bad33a85e746","Type":"ContainerStarted","Data":"5a80e5dc8f41c1f39c71d78398c411630ef37ee9ad10b81da82190c8ef8fe434"} Dec 05 08:36:44 crc kubenswrapper[4795]: I1205 08:36:44.230002 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" event={"ID":"347909b2-eaf4-4b55-becf-cde638716053","Type":"ContainerStarted","Data":"0c6362aec924f2ad6d1ae2a54a9c5eb7169a1e2220c76b9b20e4b4067b5f0515"} Dec 05 08:36:46 crc kubenswrapper[4795]: I1205 08:36:46.471741 4795 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 05 08:36:48 crc kubenswrapper[4795]: I1205 08:36:48.257495 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" event={"ID":"347909b2-eaf4-4b55-becf-cde638716053","Type":"ContainerStarted","Data":"eee674f1cec26a341ea99084fcc4520b850d6bc015325532f37f44a36e8e924a"} Dec 05 08:36:48 crc kubenswrapper[4795]: I1205 08:36:48.261071 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:48 crc kubenswrapper[4795]: I1205 08:36:48.261186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fhm2n" event={"ID":"8513d59a-88fb-4414-bfa8-e5fcfc599cca","Type":"ContainerStarted","Data":"559738753b861f5d924d6351d49c8624c8ff0772cf207168e47c0aacc84f05a7"} Dec 05 08:36:48 crc kubenswrapper[4795]: I1205 08:36:48.262517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" event={"ID":"8b96bb54-83d3-4c36-a347-bad33a85e746","Type":"ContainerStarted","Data":"04e2690e2d302ed818ee6bd0777210858591320a49ec059abd4fdb386571f519"} Dec 05 08:36:48 crc kubenswrapper[4795]: I1205 08:36:48.263051 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:36:48 crc kubenswrapper[4795]: I1205 08:36:48.281060 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fhm2n" podStartSLOduration=1.6042508359999998 
podStartE2EDuration="6.281001262s" podCreationTimestamp="2025-12-05 08:36:42 +0000 UTC" firstStartedPulling="2025-12-05 08:36:42.668392334 +0000 UTC m=+754.240996073" lastFinishedPulling="2025-12-05 08:36:47.34514276 +0000 UTC m=+758.917746499" observedRunningTime="2025-12-05 08:36:48.279332618 +0000 UTC m=+759.851936417" watchObservedRunningTime="2025-12-05 08:36:48.281001262 +0000 UTC m=+759.853605001" Dec 05 08:36:48 crc kubenswrapper[4795]: I1205 08:36:48.285449 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57cf76c794-fv9fk" podStartSLOduration=6.285401516 podStartE2EDuration="6.285401516s" podCreationTimestamp="2025-12-05 08:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:36:44.247541813 +0000 UTC m=+755.820145562" watchObservedRunningTime="2025-12-05 08:36:48.285401516 +0000 UTC m=+759.858005265" Dec 05 08:36:48 crc kubenswrapper[4795]: I1205 08:36:48.775302 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" podStartSLOduration=3.011798387 podStartE2EDuration="6.775275708s" podCreationTimestamp="2025-12-05 08:36:42 +0000 UTC" firstStartedPulling="2025-12-05 08:36:43.631517986 +0000 UTC m=+755.204121725" lastFinishedPulling="2025-12-05 08:36:47.394995307 +0000 UTC m=+758.967599046" observedRunningTime="2025-12-05 08:36:48.305371776 +0000 UTC m=+759.877975515" watchObservedRunningTime="2025-12-05 08:36:48.775275708 +0000 UTC m=+760.347879447" Dec 05 08:36:51 crc kubenswrapper[4795]: I1205 08:36:51.299155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" event={"ID":"347909b2-eaf4-4b55-becf-cde638716053","Type":"ContainerStarted","Data":"e02c49f36941dc259d0d61bf1106bb9fb141e4bef3c86b1f79d89e610e51e479"} Dec 05 08:36:51 crc kubenswrapper[4795]: I1205 
08:36:51.330453 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4zm26" podStartSLOduration=2.223154958 podStartE2EDuration="9.330428604s" podCreationTimestamp="2025-12-05 08:36:42 +0000 UTC" firstStartedPulling="2025-12-05 08:36:43.249532493 +0000 UTC m=+754.822136232" lastFinishedPulling="2025-12-05 08:36:50.356806139 +0000 UTC m=+761.929409878" observedRunningTime="2025-12-05 08:36:51.325780312 +0000 UTC m=+762.898384051" watchObservedRunningTime="2025-12-05 08:36:51.330428604 +0000 UTC m=+762.903032333" Dec 05 08:36:52 crc kubenswrapper[4795]: I1205 08:36:52.671063 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fhm2n" Dec 05 08:36:53 crc kubenswrapper[4795]: I1205 08:36:53.067758 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:53 crc kubenswrapper[4795]: I1205 08:36:53.067845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:53 crc kubenswrapper[4795]: I1205 08:36:53.076144 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:53 crc kubenswrapper[4795]: I1205 08:36:53.314196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57cf76c794-fv9fk" Dec 05 08:36:53 crc kubenswrapper[4795]: I1205 08:36:53.410957 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-r8zdl"] Dec 05 08:36:56 crc kubenswrapper[4795]: I1205 08:36:56.342922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" 
event={"ID":"e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6","Type":"ContainerStarted","Data":"4e2296f6f5cce93dc5132635f433dc3241261d9e3ca7a7c3f6cc13fb17a1d1bd"} Dec 05 08:36:56 crc kubenswrapper[4795]: I1205 08:36:56.368897 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-c5s7r" podStartSLOduration=2.32577945 podStartE2EDuration="14.368868707s" podCreationTimestamp="2025-12-05 08:36:42 +0000 UTC" firstStartedPulling="2025-12-05 08:36:43.734677172 +0000 UTC m=+755.307280911" lastFinishedPulling="2025-12-05 08:36:55.777766429 +0000 UTC m=+767.350370168" observedRunningTime="2025-12-05 08:36:56.367054469 +0000 UTC m=+767.939658238" watchObservedRunningTime="2025-12-05 08:36:56.368868707 +0000 UTC m=+767.941472476" Dec 05 08:37:03 crc kubenswrapper[4795]: I1205 08:37:03.179650 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-6bqfk" Dec 05 08:37:10 crc kubenswrapper[4795]: I1205 08:37:10.827489 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:37:10 crc kubenswrapper[4795]: I1205 08:37:10.829974 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.459976 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-r8zdl" podUID="67c6f735-c0f7-4539-a2d4-0785b4238435" containerName="console" 
containerID="cri-o://ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837" gracePeriod=15 Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.759185 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9"] Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.762043 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.767445 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.767962 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9"] Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.868048 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-r8zdl_67c6f735-c0f7-4539-a2d4-0785b4238435/console/0.log" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.868380 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.887127 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.887188 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8bth\" (UniqueName: \"kubernetes.io/projected/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-kube-api-access-r8bth\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.887241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.988476 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-oauth-serving-cert\") pod \"67c6f735-c0f7-4539-a2d4-0785b4238435\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.988586 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-serving-cert\") pod \"67c6f735-c0f7-4539-a2d4-0785b4238435\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.988692 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxdsw\" (UniqueName: \"kubernetes.io/projected/67c6f735-c0f7-4539-a2d4-0785b4238435-kube-api-access-rxdsw\") pod \"67c6f735-c0f7-4539-a2d4-0785b4238435\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.988732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-trusted-ca-bundle\") pod \"67c6f735-c0f7-4539-a2d4-0785b4238435\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.988767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-console-config\") pod \"67c6f735-c0f7-4539-a2d4-0785b4238435\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.988817 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-oauth-config\") pod \"67c6f735-c0f7-4539-a2d4-0785b4238435\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.988854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-service-ca\") pod 
\"67c6f735-c0f7-4539-a2d4-0785b4238435\" (UID: \"67c6f735-c0f7-4539-a2d4-0785b4238435\") " Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.989159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.989196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8bth\" (UniqueName: \"kubernetes.io/projected/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-kube-api-access-r8bth\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.989242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.989861 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "67c6f735-c0f7-4539-a2d4-0785b4238435" (UID: "67c6f735-c0f7-4539-a2d4-0785b4238435"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.990199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.990273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-console-config" (OuterVolumeSpecName: "console-config") pod "67c6f735-c0f7-4539-a2d4-0785b4238435" (UID: "67c6f735-c0f7-4539-a2d4-0785b4238435"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.990310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "67c6f735-c0f7-4539-a2d4-0785b4238435" (UID: "67c6f735-c0f7-4539-a2d4-0785b4238435"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.990406 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.990738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-service-ca" (OuterVolumeSpecName: "service-ca") pod "67c6f735-c0f7-4539-a2d4-0785b4238435" (UID: "67c6f735-c0f7-4539-a2d4-0785b4238435"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.997893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "67c6f735-c0f7-4539-a2d4-0785b4238435" (UID: "67c6f735-c0f7-4539-a2d4-0785b4238435"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:37:18 crc kubenswrapper[4795]: I1205 08:37:18.998358 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c6f735-c0f7-4539-a2d4-0785b4238435-kube-api-access-rxdsw" (OuterVolumeSpecName: "kube-api-access-rxdsw") pod "67c6f735-c0f7-4539-a2d4-0785b4238435" (UID: "67c6f735-c0f7-4539-a2d4-0785b4238435"). InnerVolumeSpecName "kube-api-access-rxdsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.005880 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "67c6f735-c0f7-4539-a2d4-0785b4238435" (UID: "67c6f735-c0f7-4539-a2d4-0785b4238435"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.009602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8bth\" (UniqueName: \"kubernetes.io/projected/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-kube-api-access-r8bth\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.090444 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.090977 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxdsw\" (UniqueName: \"kubernetes.io/projected/67c6f735-c0f7-4539-a2d4-0785b4238435-kube-api-access-rxdsw\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.091029 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.091043 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-console-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.091055 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.091072 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-service-ca\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.091086 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/67c6f735-c0f7-4539-a2d4-0785b4238435-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.091098 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/67c6f735-c0f7-4539-a2d4-0785b4238435-console-serving-cert\") 
on node \"crc\" DevicePath \"\"" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.316230 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9"] Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.525523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" event={"ID":"3cd05f4a-f1f2-4d26-bbd8-1216247ed955","Type":"ContainerStarted","Data":"e8aad47ced65a318f0ecf9f17564133002b8a12818d57ba35a0075a27b7d5bbb"} Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.526651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" event={"ID":"3cd05f4a-f1f2-4d26-bbd8-1216247ed955","Type":"ContainerStarted","Data":"dadaec14f58b4a877bdbcd4eb02488a4c6a7ff711d9c90581a49258e800ae348"} Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.528529 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-r8zdl_67c6f735-c0f7-4539-a2d4-0785b4238435/console/0.log" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.528574 4795 generic.go:334] "Generic (PLEG): container finished" podID="67c6f735-c0f7-4539-a2d4-0785b4238435" containerID="ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837" exitCode=2 Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.528628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r8zdl" event={"ID":"67c6f735-c0f7-4539-a2d4-0785b4238435","Type":"ContainerDied","Data":"ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837"} Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.528662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-r8zdl" 
event={"ID":"67c6f735-c0f7-4539-a2d4-0785b4238435","Type":"ContainerDied","Data":"be1324b836f833629e0679c3440e0b9a7166ce0f051ba18af249f5603076b35e"} Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.528686 4795 scope.go:117] "RemoveContainer" containerID="ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.529112 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-r8zdl" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.608947 4795 scope.go:117] "RemoveContainer" containerID="ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837" Dec 05 08:37:19 crc kubenswrapper[4795]: E1205 08:37:19.609533 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837\": container with ID starting with ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837 not found: ID does not exist" containerID="ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.609583 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837"} err="failed to get container status \"ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837\": rpc error: code = NotFound desc = could not find container \"ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837\": container with ID starting with ea85c21407f49ec8fa2f534884a4a2a23dcd5f40b40708395265f87b2a410837 not found: ID does not exist" Dec 05 08:37:19 crc kubenswrapper[4795]: E1205 08:37:19.622889 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c6f735_c0f7_4539_a2d4_0785b4238435.slice\": RecentStats: unable to find data in memory cache]" Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.630403 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-r8zdl"] Dec 05 08:37:19 crc kubenswrapper[4795]: I1205 08:37:19.634074 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-r8zdl"] Dec 05 08:37:20 crc kubenswrapper[4795]: I1205 08:37:20.539124 4795 generic.go:334] "Generic (PLEG): container finished" podID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerID="e8aad47ced65a318f0ecf9f17564133002b8a12818d57ba35a0075a27b7d5bbb" exitCode=0 Dec 05 08:37:20 crc kubenswrapper[4795]: I1205 08:37:20.539235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" event={"ID":"3cd05f4a-f1f2-4d26-bbd8-1216247ed955","Type":"ContainerDied","Data":"e8aad47ced65a318f0ecf9f17564133002b8a12818d57ba35a0075a27b7d5bbb"} Dec 05 08:37:20 crc kubenswrapper[4795]: I1205 08:37:20.754258 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c6f735-c0f7-4539-a2d4-0785b4238435" path="/var/lib/kubelet/pods/67c6f735-c0f7-4539-a2d4-0785b4238435/volumes" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.085226 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7x7bt"] Dec 05 08:37:21 crc kubenswrapper[4795]: E1205 08:37:21.085509 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c6f735-c0f7-4539-a2d4-0785b4238435" containerName="console" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.085522 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c6f735-c0f7-4539-a2d4-0785b4238435" containerName="console" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.085686 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="67c6f735-c0f7-4539-a2d4-0785b4238435" containerName="console" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.086562 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.098413 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7x7bt"] Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.122531 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-catalog-content\") pod \"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.122648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-utilities\") pod \"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.122721 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl2wc\" (UniqueName: \"kubernetes.io/projected/20a6561d-2f5d-4646-8c63-34e88888e17e-kube-api-access-cl2wc\") pod \"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.224424 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl2wc\" (UniqueName: \"kubernetes.io/projected/20a6561d-2f5d-4646-8c63-34e88888e17e-kube-api-access-cl2wc\") pod 
\"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.224987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-catalog-content\") pod \"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.225051 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-utilities\") pod \"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.225677 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-catalog-content\") pod \"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.225730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-utilities\") pod \"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.250745 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl2wc\" (UniqueName: \"kubernetes.io/projected/20a6561d-2f5d-4646-8c63-34e88888e17e-kube-api-access-cl2wc\") pod \"redhat-operators-7x7bt\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " 
pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:21 crc kubenswrapper[4795]: I1205 08:37:21.413637 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:22 crc kubenswrapper[4795]: I1205 08:37:22.174668 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7x7bt"] Dec 05 08:37:22 crc kubenswrapper[4795]: W1205 08:37:22.228377 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20a6561d_2f5d_4646_8c63_34e88888e17e.slice/crio-95d855584e2177dc250a27bc58f908edb138249a54fcd70e509b35ea9afece32 WatchSource:0}: Error finding container 95d855584e2177dc250a27bc58f908edb138249a54fcd70e509b35ea9afece32: Status 404 returned error can't find the container with id 95d855584e2177dc250a27bc58f908edb138249a54fcd70e509b35ea9afece32 Dec 05 08:37:22 crc kubenswrapper[4795]: I1205 08:37:22.581446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" event={"ID":"3cd05f4a-f1f2-4d26-bbd8-1216247ed955","Type":"ContainerStarted","Data":"8c947452fc361e220e9c21da19501f07134948f13174728aa33dee3309ec5a1e"} Dec 05 08:37:22 crc kubenswrapper[4795]: I1205 08:37:22.588388 4795 generic.go:334] "Generic (PLEG): container finished" podID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerID="5f8c5ebd1fe9c315fbb2045df520904b41ca811b9f1642ff04296a291dbfa281" exitCode=0 Dec 05 08:37:22 crc kubenswrapper[4795]: I1205 08:37:22.588440 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x7bt" event={"ID":"20a6561d-2f5d-4646-8c63-34e88888e17e","Type":"ContainerDied","Data":"5f8c5ebd1fe9c315fbb2045df520904b41ca811b9f1642ff04296a291dbfa281"} Dec 05 08:37:22 crc kubenswrapper[4795]: I1205 08:37:22.588470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-7x7bt" event={"ID":"20a6561d-2f5d-4646-8c63-34e88888e17e","Type":"ContainerStarted","Data":"95d855584e2177dc250a27bc58f908edb138249a54fcd70e509b35ea9afece32"} Dec 05 08:37:23 crc kubenswrapper[4795]: I1205 08:37:23.597086 4795 generic.go:334] "Generic (PLEG): container finished" podID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerID="8c947452fc361e220e9c21da19501f07134948f13174728aa33dee3309ec5a1e" exitCode=0 Dec 05 08:37:23 crc kubenswrapper[4795]: I1205 08:37:23.597188 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" event={"ID":"3cd05f4a-f1f2-4d26-bbd8-1216247ed955","Type":"ContainerDied","Data":"8c947452fc361e220e9c21da19501f07134948f13174728aa33dee3309ec5a1e"} Dec 05 08:37:24 crc kubenswrapper[4795]: I1205 08:37:24.604510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" event={"ID":"3cd05f4a-f1f2-4d26-bbd8-1216247ed955","Type":"ContainerStarted","Data":"ba7dd31aee85763ef1f7c7607f124e8f8c134753b1a5a0b0fb4255f76d960190"} Dec 05 08:37:24 crc kubenswrapper[4795]: I1205 08:37:24.626968 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" podStartSLOduration=4.853603133 podStartE2EDuration="6.626945297s" podCreationTimestamp="2025-12-05 08:37:18 +0000 UTC" firstStartedPulling="2025-12-05 08:37:20.541937651 +0000 UTC m=+792.114541390" lastFinishedPulling="2025-12-05 08:37:22.315279815 +0000 UTC m=+793.887883554" observedRunningTime="2025-12-05 08:37:24.622434076 +0000 UTC m=+796.195037815" watchObservedRunningTime="2025-12-05 08:37:24.626945297 +0000 UTC m=+796.199549026" Dec 05 08:37:25 crc kubenswrapper[4795]: I1205 08:37:25.614402 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerID="ba7dd31aee85763ef1f7c7607f124e8f8c134753b1a5a0b0fb4255f76d960190" exitCode=0 Dec 05 08:37:25 crc kubenswrapper[4795]: I1205 08:37:25.614505 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" event={"ID":"3cd05f4a-f1f2-4d26-bbd8-1216247ed955","Type":"ContainerDied","Data":"ba7dd31aee85763ef1f7c7607f124e8f8c134753b1a5a0b0fb4255f76d960190"} Dec 05 08:37:25 crc kubenswrapper[4795]: I1205 08:37:25.619003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x7bt" event={"ID":"20a6561d-2f5d-4646-8c63-34e88888e17e","Type":"ContainerStarted","Data":"d5fb37612fe11f21507cdd8012c02107444ca3ca3e30240f3e972c2fb8995568"} Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.216082 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.367488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-bundle\") pod \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.367635 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8bth\" (UniqueName: \"kubernetes.io/projected/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-kube-api-access-r8bth\") pod \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.367663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-util\") 
pod \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\" (UID: \"3cd05f4a-f1f2-4d26-bbd8-1216247ed955\") " Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.368905 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-bundle" (OuterVolumeSpecName: "bundle") pod "3cd05f4a-f1f2-4d26-bbd8-1216247ed955" (UID: "3cd05f4a-f1f2-4d26-bbd8-1216247ed955"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.379007 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-util" (OuterVolumeSpecName: "util") pod "3cd05f4a-f1f2-4d26-bbd8-1216247ed955" (UID: "3cd05f4a-f1f2-4d26-bbd8-1216247ed955"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.382841 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-kube-api-access-r8bth" (OuterVolumeSpecName: "kube-api-access-r8bth") pod "3cd05f4a-f1f2-4d26-bbd8-1216247ed955" (UID: "3cd05f4a-f1f2-4d26-bbd8-1216247ed955"). InnerVolumeSpecName "kube-api-access-r8bth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.469231 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.469268 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8bth\" (UniqueName: \"kubernetes.io/projected/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-kube-api-access-r8bth\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.469280 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cd05f4a-f1f2-4d26-bbd8-1216247ed955-util\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.636781 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" event={"ID":"3cd05f4a-f1f2-4d26-bbd8-1216247ed955","Type":"ContainerDied","Data":"dadaec14f58b4a877bdbcd4eb02488a4c6a7ff711d9c90581a49258e800ae348"} Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.636858 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dadaec14f58b4a877bdbcd4eb02488a4c6a7ff711d9c90581a49258e800ae348" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.636981 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9" Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.641693 4795 generic.go:334] "Generic (PLEG): container finished" podID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerID="d5fb37612fe11f21507cdd8012c02107444ca3ca3e30240f3e972c2fb8995568" exitCode=0 Dec 05 08:37:27 crc kubenswrapper[4795]: I1205 08:37:27.641781 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x7bt" event={"ID":"20a6561d-2f5d-4646-8c63-34e88888e17e","Type":"ContainerDied","Data":"d5fb37612fe11f21507cdd8012c02107444ca3ca3e30240f3e972c2fb8995568"} Dec 05 08:37:28 crc kubenswrapper[4795]: I1205 08:37:28.661095 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x7bt" event={"ID":"20a6561d-2f5d-4646-8c63-34e88888e17e","Type":"ContainerStarted","Data":"2c1d7ade5e369d8bfc7ef4a19be2eace36036456e5250fd1837d2b5c009c1dca"} Dec 05 08:37:28 crc kubenswrapper[4795]: I1205 08:37:28.685415 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7x7bt" podStartSLOduration=2.1529198 podStartE2EDuration="7.685386744s" podCreationTimestamp="2025-12-05 08:37:21 +0000 UTC" firstStartedPulling="2025-12-05 08:37:22.590254804 +0000 UTC m=+794.162858543" lastFinishedPulling="2025-12-05 08:37:28.122721758 +0000 UTC m=+799.695325487" observedRunningTime="2025-12-05 08:37:28.680503724 +0000 UTC m=+800.253107463" watchObservedRunningTime="2025-12-05 08:37:28.685386744 +0000 UTC m=+800.257990483" Dec 05 08:37:31 crc kubenswrapper[4795]: I1205 08:37:31.414926 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:31 crc kubenswrapper[4795]: I1205 08:37:31.415430 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:32 crc kubenswrapper[4795]: I1205 08:37:32.530405 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7x7bt" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerName="registry-server" probeResult="failure" output=< Dec 05 08:37:32 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 08:37:32 crc kubenswrapper[4795]: > Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.009790 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6"] Dec 05 08:37:37 crc kubenswrapper[4795]: E1205 08:37:37.010852 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerName="util" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.010868 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerName="util" Dec 05 08:37:37 crc kubenswrapper[4795]: E1205 08:37:37.010887 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerName="extract" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.010895 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerName="extract" Dec 05 08:37:37 crc kubenswrapper[4795]: E1205 08:37:37.010906 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerName="pull" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.010914 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerName="pull" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.011042 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd05f4a-f1f2-4d26-bbd8-1216247ed955" containerName="extract" Dec 05 08:37:37 
crc kubenswrapper[4795]: I1205 08:37:37.011552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.022090 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.031076 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.031423 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.032839 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-27qfw" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.035278 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.058057 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6"] Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.202059 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqlnj\" (UniqueName: \"kubernetes.io/projected/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-kube-api-access-vqlnj\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.202166 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-apiservice-cert\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.202249 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-webhook-cert\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.303758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqlnj\" (UniqueName: \"kubernetes.io/projected/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-kube-api-access-vqlnj\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.303867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-apiservice-cert\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.303912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-webhook-cert\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " 
pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.311222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-apiservice-cert\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.315341 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-webhook-cert\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.338382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqlnj\" (UniqueName: \"kubernetes.io/projected/3a2de58b-f473-4a10-a1c3-1286a5a28aa3-kube-api-access-vqlnj\") pod \"metallb-operator-controller-manager-f466d6f9b-tnbv6\" (UID: \"3a2de58b-f473-4a10-a1c3-1286a5a28aa3\") " pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.363537 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7"] Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.364426 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.368981 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.369880 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.371004 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-28rv9" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.393467 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7"] Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.405739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b4285b69-fac1-4122-ae5c-8017dcd83316-apiservice-cert\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.405944 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9q94\" (UniqueName: \"kubernetes.io/projected/b4285b69-fac1-4122-ae5c-8017dcd83316-kube-api-access-n9q94\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.406027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/b4285b69-fac1-4122-ae5c-8017dcd83316-webhook-cert\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.507033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b4285b69-fac1-4122-ae5c-8017dcd83316-apiservice-cert\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.507135 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9q94\" (UniqueName: \"kubernetes.io/projected/b4285b69-fac1-4122-ae5c-8017dcd83316-kube-api-access-n9q94\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.507183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b4285b69-fac1-4122-ae5c-8017dcd83316-webhook-cert\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.510978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b4285b69-fac1-4122-ae5c-8017dcd83316-webhook-cert\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc 
kubenswrapper[4795]: I1205 08:37:37.524959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b4285b69-fac1-4122-ae5c-8017dcd83316-apiservice-cert\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.529489 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9q94\" (UniqueName: \"kubernetes.io/projected/b4285b69-fac1-4122-ae5c-8017dcd83316-kube-api-access-n9q94\") pod \"metallb-operator-webhook-server-6f86ff657c-whbk7\" (UID: \"b4285b69-fac1-4122-ae5c-8017dcd83316\") " pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.636800 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:37 crc kubenswrapper[4795]: I1205 08:37:37.690004 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:38 crc kubenswrapper[4795]: I1205 08:37:38.353742 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6"] Dec 05 08:37:38 crc kubenswrapper[4795]: I1205 08:37:38.535377 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7"] Dec 05 08:37:38 crc kubenswrapper[4795]: I1205 08:37:38.964388 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" event={"ID":"3a2de58b-f473-4a10-a1c3-1286a5a28aa3","Type":"ContainerStarted","Data":"4e906ccee1a7b61eb4ce828924e9cf70db2c3bbd2d36f915f29a84a88f8fb913"} Dec 05 08:37:38 crc kubenswrapper[4795]: I1205 08:37:38.965638 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" event={"ID":"b4285b69-fac1-4122-ae5c-8017dcd83316","Type":"ContainerStarted","Data":"710ac44c944996c5bdca6660a9e155ff37e7fe39fa8e347cc1827dd5e60eed91"} Dec 05 08:37:40 crc kubenswrapper[4795]: I1205 08:37:40.827651 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:37:40 crc kubenswrapper[4795]: I1205 08:37:40.828103 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:37:41 crc kubenswrapper[4795]: I1205 08:37:41.571216 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:41 crc kubenswrapper[4795]: I1205 08:37:41.653315 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:42 crc kubenswrapper[4795]: I1205 08:37:42.340635 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7x7bt"] Dec 05 08:37:43 crc kubenswrapper[4795]: I1205 08:37:42.998994 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7x7bt" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerName="registry-server" containerID="cri-o://2c1d7ade5e369d8bfc7ef4a19be2eace36036456e5250fd1837d2b5c009c1dca" gracePeriod=2 Dec 05 08:37:44 crc kubenswrapper[4795]: I1205 08:37:44.009063 4795 generic.go:334] "Generic (PLEG): container finished" podID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerID="2c1d7ade5e369d8bfc7ef4a19be2eace36036456e5250fd1837d2b5c009c1dca" exitCode=0 Dec 05 08:37:44 crc kubenswrapper[4795]: I1205 08:37:44.009123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7x7bt" event={"ID":"20a6561d-2f5d-4646-8c63-34e88888e17e","Type":"ContainerDied","Data":"2c1d7ade5e369d8bfc7ef4a19be2eace36036456e5250fd1837d2b5c009c1dca"} Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.407827 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.459005 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-catalog-content\") pod \"20a6561d-2f5d-4646-8c63-34e88888e17e\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.459123 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-utilities\") pod \"20a6561d-2f5d-4646-8c63-34e88888e17e\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.459159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl2wc\" (UniqueName: \"kubernetes.io/projected/20a6561d-2f5d-4646-8c63-34e88888e17e-kube-api-access-cl2wc\") pod \"20a6561d-2f5d-4646-8c63-34e88888e17e\" (UID: \"20a6561d-2f5d-4646-8c63-34e88888e17e\") " Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.467551 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a6561d-2f5d-4646-8c63-34e88888e17e-kube-api-access-cl2wc" (OuterVolumeSpecName: "kube-api-access-cl2wc") pod "20a6561d-2f5d-4646-8c63-34e88888e17e" (UID: "20a6561d-2f5d-4646-8c63-34e88888e17e"). InnerVolumeSpecName "kube-api-access-cl2wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.468055 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-utilities" (OuterVolumeSpecName: "utilities") pod "20a6561d-2f5d-4646-8c63-34e88888e17e" (UID: "20a6561d-2f5d-4646-8c63-34e88888e17e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.560666 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.560741 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl2wc\" (UniqueName: \"kubernetes.io/projected/20a6561d-2f5d-4646-8c63-34e88888e17e-kube-api-access-cl2wc\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.658162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20a6561d-2f5d-4646-8c63-34e88888e17e" (UID: "20a6561d-2f5d-4646-8c63-34e88888e17e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:37:45 crc kubenswrapper[4795]: I1205 08:37:45.661462 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a6561d-2f5d-4646-8c63-34e88888e17e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.029668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" event={"ID":"3a2de58b-f473-4a10-a1c3-1286a5a28aa3","Type":"ContainerStarted","Data":"db6d637d2930754d61519414afd42e51cc1d155e563e502b41431a9c46819c3c"} Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.029872 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.034249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7x7bt" event={"ID":"20a6561d-2f5d-4646-8c63-34e88888e17e","Type":"ContainerDied","Data":"95d855584e2177dc250a27bc58f908edb138249a54fcd70e509b35ea9afece32"} Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.034319 4795 scope.go:117] "RemoveContainer" containerID="2c1d7ade5e369d8bfc7ef4a19be2eace36036456e5250fd1837d2b5c009c1dca" Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.034285 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7x7bt" Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.069082 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" podStartSLOduration=3.296154239 podStartE2EDuration="10.069057546s" podCreationTimestamp="2025-12-05 08:37:36 +0000 UTC" firstStartedPulling="2025-12-05 08:37:38.368830053 +0000 UTC m=+809.941433792" lastFinishedPulling="2025-12-05 08:37:45.14173336 +0000 UTC m=+816.714337099" observedRunningTime="2025-12-05 08:37:46.055946206 +0000 UTC m=+817.628549945" watchObservedRunningTime="2025-12-05 08:37:46.069057546 +0000 UTC m=+817.641661285" Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.089998 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7x7bt"] Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.093569 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7x7bt"] Dec 05 08:37:46 crc kubenswrapper[4795]: I1205 08:37:46.758356 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" path="/var/lib/kubelet/pods/20a6561d-2f5d-4646-8c63-34e88888e17e/volumes" Dec 05 08:37:47 crc kubenswrapper[4795]: I1205 08:37:47.584641 4795 scope.go:117] "RemoveContainer" containerID="d5fb37612fe11f21507cdd8012c02107444ca3ca3e30240f3e972c2fb8995568" Dec 
05 08:37:47 crc kubenswrapper[4795]: I1205 08:37:47.648387 4795 scope.go:117] "RemoveContainer" containerID="5f8c5ebd1fe9c315fbb2045df520904b41ca811b9f1642ff04296a291dbfa281" Dec 05 08:37:48 crc kubenswrapper[4795]: I1205 08:37:48.049061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" event={"ID":"b4285b69-fac1-4122-ae5c-8017dcd83316","Type":"ContainerStarted","Data":"0bae5c0a30c29a8c8b8eaacbd64d4dc0d6435a794203b024d5278a965590165b"} Dec 05 08:37:48 crc kubenswrapper[4795]: I1205 08:37:48.051250 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:37:48 crc kubenswrapper[4795]: I1205 08:37:48.074176 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" podStartSLOduration=1.924857593 podStartE2EDuration="11.074151497s" podCreationTimestamp="2025-12-05 08:37:37 +0000 UTC" firstStartedPulling="2025-12-05 08:37:38.550951957 +0000 UTC m=+810.123555696" lastFinishedPulling="2025-12-05 08:37:47.700245861 +0000 UTC m=+819.272849600" observedRunningTime="2025-12-05 08:37:48.070716775 +0000 UTC m=+819.643320514" watchObservedRunningTime="2025-12-05 08:37:48.074151497 +0000 UTC m=+819.646755246" Dec 05 08:37:57 crc kubenswrapper[4795]: I1205 08:37:57.699556 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6f86ff657c-whbk7" Dec 05 08:38:10 crc kubenswrapper[4795]: I1205 08:38:10.827871 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:38:10 crc kubenswrapper[4795]: I1205 08:38:10.828653 4795 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:38:10 crc kubenswrapper[4795]: I1205 08:38:10.828730 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:38:10 crc kubenswrapper[4795]: I1205 08:38:10.829568 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5cfc8d950f452da6a1e1434084528e4b3072305c3ebf7fe0ef0d6483a3606312"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 08:38:10 crc kubenswrapper[4795]: I1205 08:38:10.829643 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://5cfc8d950f452da6a1e1434084528e4b3072305c3ebf7fe0ef0d6483a3606312" gracePeriod=600 Dec 05 08:38:11 crc kubenswrapper[4795]: I1205 08:38:11.198703 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="5cfc8d950f452da6a1e1434084528e4b3072305c3ebf7fe0ef0d6483a3606312" exitCode=0 Dec 05 08:38:11 crc kubenswrapper[4795]: I1205 08:38:11.198837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"5cfc8d950f452da6a1e1434084528e4b3072305c3ebf7fe0ef0d6483a3606312"} Dec 05 08:38:11 crc kubenswrapper[4795]: I1205 
08:38:11.198968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"c2f6080d55cccfdc27d13b3507aa7946f9ae66b27d3649b388782040496135b5"} Dec 05 08:38:11 crc kubenswrapper[4795]: I1205 08:38:11.198994 4795 scope.go:117] "RemoveContainer" containerID="8b70c489a5a763f3c299a750acd02daa110fa561056ff0ee76d5721bb0a9a168" Dec 05 08:38:17 crc kubenswrapper[4795]: I1205 08:38:17.643961 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-f466d6f9b-tnbv6" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.469252 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9m78n"] Dec 05 08:38:18 crc kubenswrapper[4795]: E1205 08:38:18.469670 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerName="extract-content" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.469696 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerName="extract-content" Dec 05 08:38:18 crc kubenswrapper[4795]: E1205 08:38:18.469731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerName="extract-utilities" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.469738 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerName="extract-utilities" Dec 05 08:38:18 crc kubenswrapper[4795]: E1205 08:38:18.469756 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerName="registry-server" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.469763 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" 
containerName="registry-server" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.469894 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a6561d-2f5d-4646-8c63-34e88888e17e" containerName="registry-server" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.491021 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.501908 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl"] Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.503211 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.510928 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.518763 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.519981 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-rqjc7" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.520139 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.521080 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl"] Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.588848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e1652139-9bce-404b-a089-375e6023dc34-frr-startup\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " 
pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.588912 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-reloader\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.588950 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-frr-conf\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.588975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2297e0a2-10ff-47d9-8acf-c94bf4bddc9f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-q56zl\" (UID: \"2297e0a2-10ff-47d9-8acf-c94bf4bddc9f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.589007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-frr-sockets\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.589024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1652139-9bce-404b-a089-375e6023dc34-metrics-certs\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 
08:38:18.589049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-metrics\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.589072 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8d7v\" (UniqueName: \"kubernetes.io/projected/2297e0a2-10ff-47d9-8acf-c94bf4bddc9f-kube-api-access-c8d7v\") pod \"frr-k8s-webhook-server-7fcb986d4-q56zl\" (UID: \"2297e0a2-10ff-47d9-8acf-c94bf4bddc9f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.589095 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lgp\" (UniqueName: \"kubernetes.io/projected/e1652139-9bce-404b-a089-375e6023dc34-kube-api-access-l7lgp\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.612226 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jsnlr"] Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.614216 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.618248 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.618508 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.618752 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2gfqx" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.618964 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.650946 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-rzhjw"] Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.652180 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.654035 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.654440 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-rzhjw"] Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1652139-9bce-404b-a089-375e6023dc34-metrics-certs\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2wn\" (UniqueName: \"kubernetes.io/projected/3b63dece-6484-4464-b6a2-c8dcdbb34eae-kube-api-access-cw2wn\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-metrics\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8d7v\" (UniqueName: \"kubernetes.io/projected/2297e0a2-10ff-47d9-8acf-c94bf4bddc9f-kube-api-access-c8d7v\") pod \"frr-k8s-webhook-server-7fcb986d4-q56zl\" (UID: \"2297e0a2-10ff-47d9-8acf-c94bf4bddc9f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 
08:38:18.690790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lgp\" (UniqueName: \"kubernetes.io/projected/e1652139-9bce-404b-a089-375e6023dc34-kube-api-access-l7lgp\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b63dece-6484-4464-b6a2-c8dcdbb34eae-metallb-excludel2\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a92afa97-4689-4d07-aae0-1294479b1198-cert\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-metrics-certs\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e1652139-9bce-404b-a089-375e6023dc34-frr-startup\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690906 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" 
(UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-reloader\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690924 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a92afa97-4689-4d07-aae0-1294479b1198-metrics-certs\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-frr-conf\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.690988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2297e0a2-10ff-47d9-8acf-c94bf4bddc9f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-q56zl\" (UID: \"2297e0a2-10ff-47d9-8acf-c94bf4bddc9f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.691022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.691053 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8xwj\" (UniqueName: 
\"kubernetes.io/projected/a92afa97-4689-4d07-aae0-1294479b1198-kube-api-access-c8xwj\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.691076 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-frr-sockets\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.691677 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-frr-sockets\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: E1205 08:38:18.691820 4795 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 05 08:38:18 crc kubenswrapper[4795]: E1205 08:38:18.691895 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1652139-9bce-404b-a089-375e6023dc34-metrics-certs podName:e1652139-9bce-404b-a089-375e6023dc34 nodeName:}" failed. No retries permitted until 2025-12-05 08:38:19.19187203 +0000 UTC m=+850.764475769 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1652139-9bce-404b-a089-375e6023dc34-metrics-certs") pod "frr-k8s-9m78n" (UID: "e1652139-9bce-404b-a089-375e6023dc34") : secret "frr-k8s-certs-secret" not found Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.692150 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-metrics\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.692988 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-frr-conf\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.693830 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e1652139-9bce-404b-a089-375e6023dc34-frr-startup\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.695947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e1652139-9bce-404b-a089-375e6023dc34-reloader\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.707115 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2297e0a2-10ff-47d9-8acf-c94bf4bddc9f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-q56zl\" (UID: \"2297e0a2-10ff-47d9-8acf-c94bf4bddc9f\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.712248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8d7v\" (UniqueName: \"kubernetes.io/projected/2297e0a2-10ff-47d9-8acf-c94bf4bddc9f-kube-api-access-c8d7v\") pod \"frr-k8s-webhook-server-7fcb986d4-q56zl\" (UID: \"2297e0a2-10ff-47d9-8acf-c94bf4bddc9f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.723271 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lgp\" (UniqueName: \"kubernetes.io/projected/e1652139-9bce-404b-a089-375e6023dc34-kube-api-access-l7lgp\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.792463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.792549 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8xwj\" (UniqueName: \"kubernetes.io/projected/a92afa97-4689-4d07-aae0-1294479b1198-kube-api-access-c8xwj\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.792603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2wn\" (UniqueName: \"kubernetes.io/projected/3b63dece-6484-4464-b6a2-c8dcdbb34eae-kube-api-access-cw2wn\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 
08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.792805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b63dece-6484-4464-b6a2-c8dcdbb34eae-metallb-excludel2\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: E1205 08:38:18.792810 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 08:38:18 crc kubenswrapper[4795]: E1205 08:38:18.792904 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist podName:3b63dece-6484-4464-b6a2-c8dcdbb34eae nodeName:}" failed. No retries permitted until 2025-12-05 08:38:19.292882332 +0000 UTC m=+850.865486071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist") pod "speaker-jsnlr" (UID: "3b63dece-6484-4464-b6a2-c8dcdbb34eae") : secret "metallb-memberlist" not found Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.792833 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a92afa97-4689-4d07-aae0-1294479b1198-cert\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.793056 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-metrics-certs\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.793130 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a92afa97-4689-4d07-aae0-1294479b1198-metrics-certs\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.793910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3b63dece-6484-4464-b6a2-c8dcdbb34eae-metallb-excludel2\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.797055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-metrics-certs\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.797189 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.799129 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a92afa97-4689-4d07-aae0-1294479b1198-metrics-certs\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.806983 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a92afa97-4689-4d07-aae0-1294479b1198-cert\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.814645 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8xwj\" (UniqueName: \"kubernetes.io/projected/a92afa97-4689-4d07-aae0-1294479b1198-kube-api-access-c8xwj\") pod \"controller-f8648f98b-rzhjw\" (UID: \"a92afa97-4689-4d07-aae0-1294479b1198\") " pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.819404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2wn\" (UniqueName: \"kubernetes.io/projected/3b63dece-6484-4464-b6a2-c8dcdbb34eae-kube-api-access-cw2wn\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.862953 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:18 crc kubenswrapper[4795]: I1205 08:38:18.989558 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:19 crc kubenswrapper[4795]: I1205 08:38:19.202200 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1652139-9bce-404b-a089-375e6023dc34-metrics-certs\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:19 crc kubenswrapper[4795]: I1205 08:38:19.209874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1652139-9bce-404b-a089-375e6023dc34-metrics-certs\") pod \"frr-k8s-9m78n\" (UID: \"e1652139-9bce-404b-a089-375e6023dc34\") " pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:19 crc kubenswrapper[4795]: I1205 08:38:19.264704 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-rzhjw"] Dec 05 08:38:19 crc kubenswrapper[4795]: I1205 08:38:19.303358 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:19 crc kubenswrapper[4795]: E1205 08:38:19.303592 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 05 08:38:19 crc kubenswrapper[4795]: E1205 08:38:19.303698 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist podName:3b63dece-6484-4464-b6a2-c8dcdbb34eae nodeName:}" failed. No retries permitted until 2025-12-05 08:38:20.303674767 +0000 UTC m=+851.876278506 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist") pod "speaker-jsnlr" (UID: "3b63dece-6484-4464-b6a2-c8dcdbb34eae") : secret "metallb-memberlist" not found Dec 05 08:38:19 crc kubenswrapper[4795]: I1205 08:38:19.376320 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl"] Dec 05 08:38:19 crc kubenswrapper[4795]: W1205 08:38:19.380159 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2297e0a2_10ff_47d9_8acf_c94bf4bddc9f.slice/crio-72e880e226bd0521bcacee03ea4a96ee7ea903bd5fd83621a83698a0f48f31e0 WatchSource:0}: Error finding container 72e880e226bd0521bcacee03ea4a96ee7ea903bd5fd83621a83698a0f48f31e0: Status 404 returned error can't find the container with id 72e880e226bd0521bcacee03ea4a96ee7ea903bd5fd83621a83698a0f48f31e0 Dec 05 08:38:19 crc kubenswrapper[4795]: I1205 08:38:19.447998 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.268419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-rzhjw" event={"ID":"a92afa97-4689-4d07-aae0-1294479b1198","Type":"ContainerStarted","Data":"a9de53ba7b394041a94a51bcbdd2587db9a8825fdf460d42653d94a4ea31da69"} Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.268982 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.269013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-rzhjw" event={"ID":"a92afa97-4689-4d07-aae0-1294479b1198","Type":"ContainerStarted","Data":"a3fffc8985053f77e493034b26c0a2aac402d7ca57427e6b393a1bef18d4470b"} Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.269033 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-rzhjw" event={"ID":"a92afa97-4689-4d07-aae0-1294479b1198","Type":"ContainerStarted","Data":"59ea69b93101e09748d053dadca42950fe64f485fc4a1522bf71101c750ba83c"} Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.269482 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" event={"ID":"2297e0a2-10ff-47d9-8acf-c94bf4bddc9f","Type":"ContainerStarted","Data":"72e880e226bd0521bcacee03ea4a96ee7ea903bd5fd83621a83698a0f48f31e0"} Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.270738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerStarted","Data":"456bcb3356c638d9a162faf27b98b1c7a3e74b308083ed5c14dabc2dcb4fdcc5"} Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.295583 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/controller-f8648f98b-rzhjw" podStartSLOduration=2.295557423 podStartE2EDuration="2.295557423s" podCreationTimestamp="2025-12-05 08:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:38:20.291316619 +0000 UTC m=+851.863920358" watchObservedRunningTime="2025-12-05 08:38:20.295557423 +0000 UTC m=+851.868161172" Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.322200 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.328994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3b63dece-6484-4464-b6a2-c8dcdbb34eae-memberlist\") pod \"speaker-jsnlr\" (UID: \"3b63dece-6484-4464-b6a2-c8dcdbb34eae\") " pod="metallb-system/speaker-jsnlr" Dec 05 08:38:20 crc kubenswrapper[4795]: I1205 08:38:20.431416 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jsnlr" Dec 05 08:38:21 crc kubenswrapper[4795]: I1205 08:38:21.278887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jsnlr" event={"ID":"3b63dece-6484-4464-b6a2-c8dcdbb34eae","Type":"ContainerStarted","Data":"8530b3e3ef433672502c0df456abe4d3e4572a2b5bfa18d47e2e7e60c22f6288"} Dec 05 08:38:21 crc kubenswrapper[4795]: I1205 08:38:21.279659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jsnlr" event={"ID":"3b63dece-6484-4464-b6a2-c8dcdbb34eae","Type":"ContainerStarted","Data":"de8a1ba173639b8bae14d58b0f254bec2c0cb5d4a896825f434c1e173d5f4d64"} Dec 05 08:38:21 crc kubenswrapper[4795]: I1205 08:38:21.279675 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jsnlr" event={"ID":"3b63dece-6484-4464-b6a2-c8dcdbb34eae","Type":"ContainerStarted","Data":"7ce5f28310dd602a62c83a6a1323569ce9c89409f724a571fce4c9cb4f8a0efe"} Dec 05 08:38:21 crc kubenswrapper[4795]: I1205 08:38:21.279857 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jsnlr" Dec 05 08:38:21 crc kubenswrapper[4795]: I1205 08:38:21.303136 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jsnlr" podStartSLOduration=3.303101056 podStartE2EDuration="3.303101056s" podCreationTimestamp="2025-12-05 08:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:38:21.301198996 +0000 UTC m=+852.873802735" watchObservedRunningTime="2025-12-05 08:38:21.303101056 +0000 UTC m=+852.875704795" Dec 05 08:38:30 crc kubenswrapper[4795]: I1205 08:38:30.455976 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jsnlr" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.234747 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-hxdh6"] Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.237230 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hxdh6" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.239644 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.240048 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-rt56v" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.240273 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.247346 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hxdh6"] Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.338106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrt9r\" (UniqueName: \"kubernetes.io/projected/ff4666ab-54d0-4103-8e22-ceef6067c59c-kube-api-access-hrt9r\") pod \"openstack-operator-index-hxdh6\" (UID: \"ff4666ab-54d0-4103-8e22-ceef6067c59c\") " pod="openstack-operators/openstack-operator-index-hxdh6" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.439462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrt9r\" (UniqueName: \"kubernetes.io/projected/ff4666ab-54d0-4103-8e22-ceef6067c59c-kube-api-access-hrt9r\") pod \"openstack-operator-index-hxdh6\" (UID: \"ff4666ab-54d0-4103-8e22-ceef6067c59c\") " pod="openstack-operators/openstack-operator-index-hxdh6" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.465944 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrt9r\" 
(UniqueName: \"kubernetes.io/projected/ff4666ab-54d0-4103-8e22-ceef6067c59c-kube-api-access-hrt9r\") pod \"openstack-operator-index-hxdh6\" (UID: \"ff4666ab-54d0-4103-8e22-ceef6067c59c\") " pod="openstack-operators/openstack-operator-index-hxdh6" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.555453 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hxdh6" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.630585 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1652139-9bce-404b-a089-375e6023dc34" containerID="0f59ead729f1857d25b75a2fd43f4999ddbd747473a1996786f8ca438eeb452d" exitCode=0 Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.630908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerDied","Data":"0f59ead729f1857d25b75a2fd43f4999ddbd747473a1996786f8ca438eeb452d"} Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.642592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" event={"ID":"2297e0a2-10ff-47d9-8acf-c94bf4bddc9f","Type":"ContainerStarted","Data":"770c05315d767a003ee199e36bf25cf9ac13ca367e071cfd13d4e78753937f82"} Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.642847 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:34 crc kubenswrapper[4795]: I1205 08:38:34.787390 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" podStartSLOduration=2.577673022 podStartE2EDuration="16.787371179s" podCreationTimestamp="2025-12-05 08:38:18 +0000 UTC" firstStartedPulling="2025-12-05 08:38:19.382945059 +0000 UTC m=+850.955548798" lastFinishedPulling="2025-12-05 08:38:33.592643216 +0000 UTC m=+865.165246955" 
observedRunningTime="2025-12-05 08:38:34.785458588 +0000 UTC m=+866.358062327" watchObservedRunningTime="2025-12-05 08:38:34.787371179 +0000 UTC m=+866.359974918" Dec 05 08:38:35 crc kubenswrapper[4795]: I1205 08:38:35.048317 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hxdh6"] Dec 05 08:38:35 crc kubenswrapper[4795]: W1205 08:38:35.065308 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff4666ab_54d0_4103_8e22_ceef6067c59c.slice/crio-b522685481c15fe47e7929bd4c97a86d8d3771d327cc1066f76ad543e669851f WatchSource:0}: Error finding container b522685481c15fe47e7929bd4c97a86d8d3771d327cc1066f76ad543e669851f: Status 404 returned error can't find the container with id b522685481c15fe47e7929bd4c97a86d8d3771d327cc1066f76ad543e669851f Dec 05 08:38:35 crc kubenswrapper[4795]: I1205 08:38:35.758829 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hxdh6" event={"ID":"ff4666ab-54d0-4103-8e22-ceef6067c59c","Type":"ContainerStarted","Data":"b522685481c15fe47e7929bd4c97a86d8d3771d327cc1066f76ad543e669851f"} Dec 05 08:38:35 crc kubenswrapper[4795]: I1205 08:38:35.768013 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1652139-9bce-404b-a089-375e6023dc34" containerID="64ce4df30cf264d286fd74a37db682e50141210d4fd15b06fadba959916a2f9c" exitCode=0 Dec 05 08:38:35 crc kubenswrapper[4795]: I1205 08:38:35.768985 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerDied","Data":"64ce4df30cf264d286fd74a37db682e50141210d4fd15b06fadba959916a2f9c"} Dec 05 08:38:37 crc kubenswrapper[4795]: I1205 08:38:37.964600 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hxdh6"] Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 
08:38:38.576011 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-q5zwr"] Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.576878 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.585373 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-q5zwr"] Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.586100 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm85j\" (UniqueName: \"kubernetes.io/projected/74cd8f10-9003-46be-992c-2b23202839bb-kube-api-access-dm85j\") pod \"openstack-operator-index-q5zwr\" (UID: \"74cd8f10-9003-46be-992c-2b23202839bb\") " pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.687587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm85j\" (UniqueName: \"kubernetes.io/projected/74cd8f10-9003-46be-992c-2b23202839bb-kube-api-access-dm85j\") pod \"openstack-operator-index-q5zwr\" (UID: \"74cd8f10-9003-46be-992c-2b23202839bb\") " pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.717528 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm85j\" (UniqueName: \"kubernetes.io/projected/74cd8f10-9003-46be-992c-2b23202839bb-kube-api-access-dm85j\") pod \"openstack-operator-index-q5zwr\" (UID: \"74cd8f10-9003-46be-992c-2b23202839bb\") " pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.792926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hxdh6" 
event={"ID":"ff4666ab-54d0-4103-8e22-ceef6067c59c","Type":"ContainerStarted","Data":"bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008"} Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.796959 4795 generic.go:334] "Generic (PLEG): container finished" podID="e1652139-9bce-404b-a089-375e6023dc34" containerID="b333c6cb86e84154d2d64f9f5f5941d8f46f1976ca57216c60bfc0754e55ab99" exitCode=0 Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.797022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerDied","Data":"b333c6cb86e84154d2d64f9f5f5941d8f46f1976ca57216c60bfc0754e55ab99"} Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.858389 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hxdh6" podStartSLOduration=2.136004223 podStartE2EDuration="4.858359252s" podCreationTimestamp="2025-12-05 08:38:34 +0000 UTC" firstStartedPulling="2025-12-05 08:38:35.068218024 +0000 UTC m=+866.640821763" lastFinishedPulling="2025-12-05 08:38:37.790573053 +0000 UTC m=+869.363176792" observedRunningTime="2025-12-05 08:38:38.820208054 +0000 UTC m=+870.392811783" watchObservedRunningTime="2025-12-05 08:38:38.858359252 +0000 UTC m=+870.430962991" Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.899456 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:38 crc kubenswrapper[4795]: I1205 08:38:38.995553 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-rzhjw" Dec 05 08:38:39 crc kubenswrapper[4795]: I1205 08:38:39.583354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-q5zwr"] Dec 05 08:38:39 crc kubenswrapper[4795]: I1205 08:38:39.810158 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerStarted","Data":"75e990ad07dfc751ec11ffc760afc5119798a96024035ab50e0047af9f0c4f86"} Dec 05 08:38:39 crc kubenswrapper[4795]: I1205 08:38:39.810211 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerStarted","Data":"34bd1038241a7bce537e41519651f65a742f32069872139f4e359ae11e782056"} Dec 05 08:38:39 crc kubenswrapper[4795]: I1205 08:38:39.810225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerStarted","Data":"4d0ec845d500ce0638da39d3b8f0aed6ee60b5a4c882785ec4c046098e2c9db3"} Dec 05 08:38:39 crc kubenswrapper[4795]: I1205 08:38:39.811978 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hxdh6" podUID="ff4666ab-54d0-4103-8e22-ceef6067c59c" containerName="registry-server" containerID="cri-o://bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008" gracePeriod=2 Dec 05 08:38:39 crc kubenswrapper[4795]: I1205 08:38:39.812076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q5zwr" 
event={"ID":"74cd8f10-9003-46be-992c-2b23202839bb","Type":"ContainerStarted","Data":"d58c33d0d4d93fea347cbd379527a012b2c19af6c1490e4581ebfd55ee175c44"} Dec 05 08:38:39 crc kubenswrapper[4795]: I1205 08:38:39.812097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-q5zwr" event={"ID":"74cd8f10-9003-46be-992c-2b23202839bb","Type":"ContainerStarted","Data":"336aae78c7905ea5314a6fc1ee7a7d0e6d404ac7e77b3a177e152dbe18a8f6da"} Dec 05 08:38:39 crc kubenswrapper[4795]: I1205 08:38:39.838987 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-q5zwr" podStartSLOduration=1.7773706059999999 podStartE2EDuration="1.838957667s" podCreationTimestamp="2025-12-05 08:38:38 +0000 UTC" firstStartedPulling="2025-12-05 08:38:39.607463967 +0000 UTC m=+871.180067706" lastFinishedPulling="2025-12-05 08:38:39.669051028 +0000 UTC m=+871.241654767" observedRunningTime="2025-12-05 08:38:39.83420647 +0000 UTC m=+871.406810209" watchObservedRunningTime="2025-12-05 08:38:39.838957667 +0000 UTC m=+871.411561406" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.341962 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hxdh6" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.510917 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrt9r\" (UniqueName: \"kubernetes.io/projected/ff4666ab-54d0-4103-8e22-ceef6067c59c-kube-api-access-hrt9r\") pod \"ff4666ab-54d0-4103-8e22-ceef6067c59c\" (UID: \"ff4666ab-54d0-4103-8e22-ceef6067c59c\") " Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.516870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff4666ab-54d0-4103-8e22-ceef6067c59c-kube-api-access-hrt9r" (OuterVolumeSpecName: "kube-api-access-hrt9r") pod "ff4666ab-54d0-4103-8e22-ceef6067c59c" (UID: "ff4666ab-54d0-4103-8e22-ceef6067c59c"). InnerVolumeSpecName "kube-api-access-hrt9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.613008 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrt9r\" (UniqueName: \"kubernetes.io/projected/ff4666ab-54d0-4103-8e22-ceef6067c59c-kube-api-access-hrt9r\") on node \"crc\" DevicePath \"\"" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.821293 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff4666ab-54d0-4103-8e22-ceef6067c59c" containerID="bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008" exitCode=0 Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.821389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hxdh6" event={"ID":"ff4666ab-54d0-4103-8e22-ceef6067c59c","Type":"ContainerDied","Data":"bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008"} Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.821454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hxdh6" 
event={"ID":"ff4666ab-54d0-4103-8e22-ceef6067c59c","Type":"ContainerDied","Data":"b522685481c15fe47e7929bd4c97a86d8d3771d327cc1066f76ad543e669851f"} Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.821487 4795 scope.go:117] "RemoveContainer" containerID="bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.821765 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hxdh6" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.831405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerStarted","Data":"1bf9597278488e9909e67daf62c6a784bae298ed90d1d4a0257f7de578edb53c"} Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.831455 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerStarted","Data":"74f75d32d69c3aea074f7482ff7bb6bab6d8c54392f13de2f54ef2573307b477"} Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.831476 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.831490 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9m78n" event={"ID":"e1652139-9bce-404b-a089-375e6023dc34","Type":"ContainerStarted","Data":"70f3921987beff77355fb117fd81a169597037082e514bd04ed6c53d05f278c5"} Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.846392 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hxdh6"] Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.851172 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hxdh6"] Dec 05 08:38:40 crc 
kubenswrapper[4795]: I1205 08:38:40.872681 4795 scope.go:117] "RemoveContainer" containerID="bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008" Dec 05 08:38:40 crc kubenswrapper[4795]: E1205 08:38:40.873411 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008\": container with ID starting with bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008 not found: ID does not exist" containerID="bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.873451 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008"} err="failed to get container status \"bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008\": rpc error: code = NotFound desc = could not find container \"bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008\": container with ID starting with bf21be4fbe2ccffe39172e306e1aa0d2fa9eb9b250f669a9950732eb267f1008 not found: ID does not exist" Dec 05 08:38:40 crc kubenswrapper[4795]: I1205 08:38:40.870775 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9m78n" podStartSLOduration=8.973572609 podStartE2EDuration="22.870755307s" podCreationTimestamp="2025-12-05 08:38:18 +0000 UTC" firstStartedPulling="2025-12-05 08:38:19.711910456 +0000 UTC m=+851.284514195" lastFinishedPulling="2025-12-05 08:38:33.609093164 +0000 UTC m=+865.181696893" observedRunningTime="2025-12-05 08:38:40.867003668 +0000 UTC m=+872.439607407" watchObservedRunningTime="2025-12-05 08:38:40.870755307 +0000 UTC m=+872.443359046" Dec 05 08:38:42 crc kubenswrapper[4795]: I1205 08:38:42.754959 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ff4666ab-54d0-4103-8e22-ceef6067c59c" path="/var/lib/kubelet/pods/ff4666ab-54d0-4103-8e22-ceef6067c59c/volumes" Dec 05 08:38:44 crc kubenswrapper[4795]: I1205 08:38:44.449111 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:44 crc kubenswrapper[4795]: I1205 08:38:44.488930 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:45 crc kubenswrapper[4795]: I1205 08:38:45.782308 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2m8sk"] Dec 05 08:38:45 crc kubenswrapper[4795]: E1205 08:38:45.782714 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff4666ab-54d0-4103-8e22-ceef6067c59c" containerName="registry-server" Dec 05 08:38:45 crc kubenswrapper[4795]: I1205 08:38:45.782738 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff4666ab-54d0-4103-8e22-ceef6067c59c" containerName="registry-server" Dec 05 08:38:45 crc kubenswrapper[4795]: I1205 08:38:45.782917 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff4666ab-54d0-4103-8e22-ceef6067c59c" containerName="registry-server" Dec 05 08:38:45 crc kubenswrapper[4795]: I1205 08:38:45.784002 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:45 crc kubenswrapper[4795]: I1205 08:38:45.798458 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m8sk"] Dec 05 08:38:45 crc kubenswrapper[4795]: I1205 08:38:45.903549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-catalog-content\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:45 crc kubenswrapper[4795]: I1205 08:38:45.903686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgx45\" (UniqueName: \"kubernetes.io/projected/5bef636f-2d89-41e2-b924-1eb65f41183c-kube-api-access-mgx45\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:45 crc kubenswrapper[4795]: I1205 08:38:45.903711 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-utilities\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:46 crc kubenswrapper[4795]: I1205 08:38:46.004425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-catalog-content\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:46 crc kubenswrapper[4795]: I1205 08:38:46.005134 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-catalog-content\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:46 crc kubenswrapper[4795]: I1205 08:38:46.006030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgx45\" (UniqueName: \"kubernetes.io/projected/5bef636f-2d89-41e2-b924-1eb65f41183c-kube-api-access-mgx45\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:46 crc kubenswrapper[4795]: I1205 08:38:46.006148 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-utilities\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:46 crc kubenswrapper[4795]: I1205 08:38:46.006468 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-utilities\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:46 crc kubenswrapper[4795]: I1205 08:38:46.026726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgx45\" (UniqueName: \"kubernetes.io/projected/5bef636f-2d89-41e2-b924-1eb65f41183c-kube-api-access-mgx45\") pod \"redhat-marketplace-2m8sk\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:46 crc kubenswrapper[4795]: I1205 08:38:46.118539 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:46 crc kubenswrapper[4795]: I1205 08:38:46.955471 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m8sk"] Dec 05 08:38:47 crc kubenswrapper[4795]: I1205 08:38:47.883800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m8sk" event={"ID":"5bef636f-2d89-41e2-b924-1eb65f41183c","Type":"ContainerStarted","Data":"f01eaf2ec415a3c6b3a0f35137739add0632be4afa8991f9666d57d2e14273e4"} Dec 05 08:38:48 crc kubenswrapper[4795]: I1205 08:38:48.869463 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" Dec 05 08:38:48 crc kubenswrapper[4795]: I1205 08:38:48.897480 4795 generic.go:334] "Generic (PLEG): container finished" podID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerID="0632fdc0318a531f12e72de0b9794a8c04674111d045ebce05ad344a56b33f12" exitCode=0 Dec 05 08:38:48 crc kubenswrapper[4795]: I1205 08:38:48.897570 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m8sk" event={"ID":"5bef636f-2d89-41e2-b924-1eb65f41183c","Type":"ContainerDied","Data":"0632fdc0318a531f12e72de0b9794a8c04674111d045ebce05ad344a56b33f12"} Dec 05 08:38:48 crc kubenswrapper[4795]: I1205 08:38:48.899672 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:48 crc kubenswrapper[4795]: I1205 08:38:48.899725 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:48 crc kubenswrapper[4795]: I1205 08:38:48.951274 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:49 crc kubenswrapper[4795]: I1205 08:38:49.450974 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9m78n" Dec 05 08:38:49 crc kubenswrapper[4795]: I1205 08:38:49.970948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m8sk" event={"ID":"5bef636f-2d89-41e2-b924-1eb65f41183c","Type":"ContainerStarted","Data":"59136305fd1058486df9521531f4959e70ae74a2ca46438e74d9130abad786c2"} Dec 05 08:38:50 crc kubenswrapper[4795]: I1205 08:38:50.141352 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-q5zwr" Dec 05 08:38:50 crc kubenswrapper[4795]: I1205 08:38:50.981842 4795 generic.go:334] "Generic (PLEG): container finished" podID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerID="59136305fd1058486df9521531f4959e70ae74a2ca46438e74d9130abad786c2" exitCode=0 Dec 05 08:38:50 crc kubenswrapper[4795]: I1205 08:38:50.982000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m8sk" event={"ID":"5bef636f-2d89-41e2-b924-1eb65f41183c","Type":"ContainerDied","Data":"59136305fd1058486df9521531f4959e70ae74a2ca46438e74d9130abad786c2"} Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.775193 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gpsjn"] Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.778389 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.802838 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gpsjn"] Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.828777 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-utilities\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.828889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjg6g\" (UniqueName: \"kubernetes.io/projected/b4469ae0-55f4-4cd1-9711-01aadd1377b7-kube-api-access-mjg6g\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.828960 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-catalog-content\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.929920 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-catalog-content\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.930423 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-utilities\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.930544 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjg6g\" (UniqueName: \"kubernetes.io/projected/b4469ae0-55f4-4cd1-9711-01aadd1377b7-kube-api-access-mjg6g\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.931584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-catalog-content\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.932053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-utilities\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:53 crc kubenswrapper[4795]: I1205 08:38:53.951012 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjg6g\" (UniqueName: \"kubernetes.io/projected/b4469ae0-55f4-4cd1-9711-01aadd1377b7-kube-api-access-mjg6g\") pod \"community-operators-gpsjn\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:54 crc kubenswrapper[4795]: I1205 08:38:54.005758 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-2m8sk" event={"ID":"5bef636f-2d89-41e2-b924-1eb65f41183c","Type":"ContainerStarted","Data":"7bba1bafb156148fcd8b7c8758fb509be8bd7485951c6f325bcba1d144733dcd"} Dec 05 08:38:54 crc kubenswrapper[4795]: I1205 08:38:54.094875 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:38:54 crc kubenswrapper[4795]: I1205 08:38:54.886654 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2m8sk" podStartSLOduration=5.442756757 podStartE2EDuration="9.886627338s" podCreationTimestamp="2025-12-05 08:38:45 +0000 UTC" firstStartedPulling="2025-12-05 08:38:48.89928569 +0000 UTC m=+880.471889439" lastFinishedPulling="2025-12-05 08:38:53.343156281 +0000 UTC m=+884.915760020" observedRunningTime="2025-12-05 08:38:54.03766434 +0000 UTC m=+885.610268079" watchObservedRunningTime="2025-12-05 08:38:54.886627338 +0000 UTC m=+886.459231077" Dec 05 08:38:54 crc kubenswrapper[4795]: I1205 08:38:54.889715 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gpsjn"] Dec 05 08:38:54 crc kubenswrapper[4795]: W1205 08:38:54.896057 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4469ae0_55f4_4cd1_9711_01aadd1377b7.slice/crio-d448babd3ff8c9a1a8e713d402cd9373007d76d4c408d2fc7d52af8e27d3fe91 WatchSource:0}: Error finding container d448babd3ff8c9a1a8e713d402cd9373007d76d4c408d2fc7d52af8e27d3fe91: Status 404 returned error can't find the container with id d448babd3ff8c9a1a8e713d402cd9373007d76d4c408d2fc7d52af8e27d3fe91 Dec 05 08:38:55 crc kubenswrapper[4795]: I1205 08:38:55.059565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpsjn" 
event={"ID":"b4469ae0-55f4-4cd1-9711-01aadd1377b7","Type":"ContainerStarted","Data":"d448babd3ff8c9a1a8e713d402cd9373007d76d4c408d2fc7d52af8e27d3fe91"} Dec 05 08:38:56 crc kubenswrapper[4795]: I1205 08:38:56.067152 4795 generic.go:334] "Generic (PLEG): container finished" podID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerID="54246434e3c5973c18593eb29ce6f9e5e9cfc5b8266b090b30b0143a149c0878" exitCode=0 Dec 05 08:38:56 crc kubenswrapper[4795]: I1205 08:38:56.067216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpsjn" event={"ID":"b4469ae0-55f4-4cd1-9711-01aadd1377b7","Type":"ContainerDied","Data":"54246434e3c5973c18593eb29ce6f9e5e9cfc5b8266b090b30b0143a149c0878"} Dec 05 08:38:56 crc kubenswrapper[4795]: I1205 08:38:56.120325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:56 crc kubenswrapper[4795]: I1205 08:38:56.120900 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:56 crc kubenswrapper[4795]: I1205 08:38:56.188817 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.079426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpsjn" event={"ID":"b4469ae0-55f4-4cd1-9711-01aadd1377b7","Type":"ContainerStarted","Data":"ed5760fe77f8567511d35a8316e1f124a36a8b531bab2eb8e15bb00c7349de2f"} Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.502219 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf"] Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.503654 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.505907 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf"] Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.509562 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ctmsm" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.597100 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tmvr\" (UniqueName: \"kubernetes.io/projected/2c87c120-561f-4ce2-b47c-b99fb3ea4283-kube-api-access-9tmvr\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.597173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-bundle\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.597198 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-util\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 
08:38:57.724568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-bundle\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.725091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-util\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.725171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tmvr\" (UniqueName: \"kubernetes.io/projected/2c87c120-561f-4ce2-b47c-b99fb3ea4283-kube-api-access-9tmvr\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.725265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-bundle\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.725524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-util\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.765497 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tmvr\" (UniqueName: \"kubernetes.io/projected/2c87c120-561f-4ce2-b47c-b99fb3ea4283-kube-api-access-9tmvr\") pod \"3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:57 crc kubenswrapper[4795]: I1205 08:38:57.821001 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:38:58 crc kubenswrapper[4795]: I1205 08:38:58.315509 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf"] Dec 05 08:38:59 crc kubenswrapper[4795]: I1205 08:38:59.090585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" event={"ID":"2c87c120-561f-4ce2-b47c-b99fb3ea4283","Type":"ContainerStarted","Data":"7df7333af478df58ce28c3a5e3e0243dd2fe7831772338fe502fc8511c9395b7"} Dec 05 08:38:59 crc kubenswrapper[4795]: I1205 08:38:59.090667 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" event={"ID":"2c87c120-561f-4ce2-b47c-b99fb3ea4283","Type":"ContainerStarted","Data":"361093768d36b4d20a43acd58bc53cce486f4b7fd463129afd302a1f5a82216d"} Dec 05 08:38:59 crc kubenswrapper[4795]: I1205 08:38:59.114743 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerID="ed5760fe77f8567511d35a8316e1f124a36a8b531bab2eb8e15bb00c7349de2f" exitCode=0 Dec 05 08:38:59 crc kubenswrapper[4795]: I1205 08:38:59.114819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpsjn" event={"ID":"b4469ae0-55f4-4cd1-9711-01aadd1377b7","Type":"ContainerDied","Data":"ed5760fe77f8567511d35a8316e1f124a36a8b531bab2eb8e15bb00c7349de2f"} Dec 05 08:39:00 crc kubenswrapper[4795]: I1205 08:39:00.124095 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpsjn" event={"ID":"b4469ae0-55f4-4cd1-9711-01aadd1377b7","Type":"ContainerStarted","Data":"56b788b26ededc4d57d5518030f478f35f84e33a31dd4fdb41c20a76b5917d09"} Dec 05 08:39:00 crc kubenswrapper[4795]: I1205 08:39:00.129466 4795 generic.go:334] "Generic (PLEG): container finished" podID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerID="7df7333af478df58ce28c3a5e3e0243dd2fe7831772338fe502fc8511c9395b7" exitCode=0 Dec 05 08:39:00 crc kubenswrapper[4795]: I1205 08:39:00.129554 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" event={"ID":"2c87c120-561f-4ce2-b47c-b99fb3ea4283","Type":"ContainerDied","Data":"7df7333af478df58ce28c3a5e3e0243dd2fe7831772338fe502fc8511c9395b7"} Dec 05 08:39:00 crc kubenswrapper[4795]: I1205 08:39:00.153422 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gpsjn" podStartSLOduration=3.7167391260000002 podStartE2EDuration="7.153395342s" podCreationTimestamp="2025-12-05 08:38:53 +0000 UTC" firstStartedPulling="2025-12-05 08:38:56.071095627 +0000 UTC m=+887.643699366" lastFinishedPulling="2025-12-05 08:38:59.507751853 +0000 UTC m=+891.080355582" observedRunningTime="2025-12-05 08:39:00.149032445 +0000 UTC m=+891.721636214" 
watchObservedRunningTime="2025-12-05 08:39:00.153395342 +0000 UTC m=+891.725999091" Dec 05 08:39:01 crc kubenswrapper[4795]: I1205 08:39:01.136110 4795 generic.go:334] "Generic (PLEG): container finished" podID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerID="f10530e5b4f858a4029c0c19a14935f1679bbb3a3aebff8f34d3bd8638c36cef" exitCode=0 Dec 05 08:39:01 crc kubenswrapper[4795]: I1205 08:39:01.136337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" event={"ID":"2c87c120-561f-4ce2-b47c-b99fb3ea4283","Type":"ContainerDied","Data":"f10530e5b4f858a4029c0c19a14935f1679bbb3a3aebff8f34d3bd8638c36cef"} Dec 05 08:39:02 crc kubenswrapper[4795]: I1205 08:39:02.144094 4795 generic.go:334] "Generic (PLEG): container finished" podID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerID="9c30eb351c59e64da0d5706c1121f4255044d2521c37f37c582c788ca66cd4e1" exitCode=0 Dec 05 08:39:02 crc kubenswrapper[4795]: I1205 08:39:02.144513 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" event={"ID":"2c87c120-561f-4ce2-b47c-b99fb3ea4283","Type":"ContainerDied","Data":"9c30eb351c59e64da0d5706c1121f4255044d2521c37f37c582c788ca66cd4e1"} Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.512521 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.520854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-bundle\") pod \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.520912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-util\") pod \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.520962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tmvr\" (UniqueName: \"kubernetes.io/projected/2c87c120-561f-4ce2-b47c-b99fb3ea4283-kube-api-access-9tmvr\") pod \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\" (UID: \"2c87c120-561f-4ce2-b47c-b99fb3ea4283\") " Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.521811 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-bundle" (OuterVolumeSpecName: "bundle") pod "2c87c120-561f-4ce2-b47c-b99fb3ea4283" (UID: "2c87c120-561f-4ce2-b47c-b99fb3ea4283"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.528949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c87c120-561f-4ce2-b47c-b99fb3ea4283-kube-api-access-9tmvr" (OuterVolumeSpecName: "kube-api-access-9tmvr") pod "2c87c120-561f-4ce2-b47c-b99fb3ea4283" (UID: "2c87c120-561f-4ce2-b47c-b99fb3ea4283"). InnerVolumeSpecName "kube-api-access-9tmvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.549550 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-util" (OuterVolumeSpecName: "util") pod "2c87c120-561f-4ce2-b47c-b99fb3ea4283" (UID: "2c87c120-561f-4ce2-b47c-b99fb3ea4283"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.622498 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tmvr\" (UniqueName: \"kubernetes.io/projected/2c87c120-561f-4ce2-b47c-b99fb3ea4283-kube-api-access-9tmvr\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.622537 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:03 crc kubenswrapper[4795]: I1205 08:39:03.622546 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c87c120-561f-4ce2-b47c-b99fb3ea4283-util\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:04 crc kubenswrapper[4795]: I1205 08:39:04.096823 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:39:04 crc kubenswrapper[4795]: I1205 08:39:04.096878 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:39:04 crc kubenswrapper[4795]: I1205 08:39:04.144842 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:39:04 crc kubenswrapper[4795]: I1205 08:39:04.162683 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" Dec 05 08:39:04 crc kubenswrapper[4795]: I1205 08:39:04.162739 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf" event={"ID":"2c87c120-561f-4ce2-b47c-b99fb3ea4283","Type":"ContainerDied","Data":"361093768d36b4d20a43acd58bc53cce486f4b7fd463129afd302a1f5a82216d"} Dec 05 08:39:04 crc kubenswrapper[4795]: I1205 08:39:04.162773 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="361093768d36b4d20a43acd58bc53cce486f4b7fd463129afd302a1f5a82216d" Dec 05 08:39:04 crc kubenswrapper[4795]: I1205 08:39:04.225452 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:39:05 crc kubenswrapper[4795]: I1205 08:39:05.163261 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gpsjn"] Dec 05 08:39:06 crc kubenswrapper[4795]: I1205 08:39:06.177200 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gpsjn" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerName="registry-server" containerID="cri-o://56b788b26ededc4d57d5518030f478f35f84e33a31dd4fdb41c20a76b5917d09" gracePeriod=2 Dec 05 08:39:06 crc kubenswrapper[4795]: I1205 08:39:06.179334 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:39:07 crc kubenswrapper[4795]: I1205 08:39:07.188550 4795 generic.go:334] "Generic (PLEG): container finished" podID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerID="56b788b26ededc4d57d5518030f478f35f84e33a31dd4fdb41c20a76b5917d09" exitCode=0 Dec 05 08:39:07 crc kubenswrapper[4795]: I1205 08:39:07.188601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-gpsjn" event={"ID":"b4469ae0-55f4-4cd1-9711-01aadd1377b7","Type":"ContainerDied","Data":"56b788b26ededc4d57d5518030f478f35f84e33a31dd4fdb41c20a76b5917d09"} Dec 05 08:39:07 crc kubenswrapper[4795]: I1205 08:39:07.869708 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:39:07 crc kubenswrapper[4795]: I1205 08:39:07.993452 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-utilities\") pod \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " Dec 05 08:39:07 crc kubenswrapper[4795]: I1205 08:39:07.993575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjg6g\" (UniqueName: \"kubernetes.io/projected/b4469ae0-55f4-4cd1-9711-01aadd1377b7-kube-api-access-mjg6g\") pod \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " Dec 05 08:39:07 crc kubenswrapper[4795]: I1205 08:39:07.993674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-catalog-content\") pod \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\" (UID: \"b4469ae0-55f4-4cd1-9711-01aadd1377b7\") " Dec 05 08:39:07 crc kubenswrapper[4795]: I1205 08:39:07.994947 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-utilities" (OuterVolumeSpecName: "utilities") pod "b4469ae0-55f4-4cd1-9711-01aadd1377b7" (UID: "b4469ae0-55f4-4cd1-9711-01aadd1377b7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.000534 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4469ae0-55f4-4cd1-9711-01aadd1377b7-kube-api-access-mjg6g" (OuterVolumeSpecName: "kube-api-access-mjg6g") pod "b4469ae0-55f4-4cd1-9711-01aadd1377b7" (UID: "b4469ae0-55f4-4cd1-9711-01aadd1377b7"). InnerVolumeSpecName "kube-api-access-mjg6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.047295 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4469ae0-55f4-4cd1-9711-01aadd1377b7" (UID: "b4469ae0-55f4-4cd1-9711-01aadd1377b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.095847 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.095890 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjg6g\" (UniqueName: \"kubernetes.io/projected/b4469ae0-55f4-4cd1-9711-01aadd1377b7-kube-api-access-mjg6g\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.095902 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4469ae0-55f4-4cd1-9711-01aadd1377b7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.200594 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gpsjn" 
event={"ID":"b4469ae0-55f4-4cd1-9711-01aadd1377b7","Type":"ContainerDied","Data":"d448babd3ff8c9a1a8e713d402cd9373007d76d4c408d2fc7d52af8e27d3fe91"} Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.200729 4795 scope.go:117] "RemoveContainer" containerID="56b788b26ededc4d57d5518030f478f35f84e33a31dd4fdb41c20a76b5917d09" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.200746 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gpsjn" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.222931 4795 scope.go:117] "RemoveContainer" containerID="ed5760fe77f8567511d35a8316e1f124a36a8b531bab2eb8e15bb00c7349de2f" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.255699 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gpsjn"] Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.260921 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gpsjn"] Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.265405 4795 scope.go:117] "RemoveContainer" containerID="54246434e3c5973c18593eb29ce6f9e5e9cfc5b8266b090b30b0143a149c0878" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.410770 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb"] Dec 05 08:39:08 crc kubenswrapper[4795]: E1205 08:39:08.411520 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerName="extract" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.411545 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerName="extract" Dec 05 08:39:08 crc kubenswrapper[4795]: E1205 08:39:08.411557 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" 
containerName="extract-utilities" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.411565 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerName="extract-utilities" Dec 05 08:39:08 crc kubenswrapper[4795]: E1205 08:39:08.411579 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerName="util" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.411586 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerName="util" Dec 05 08:39:08 crc kubenswrapper[4795]: E1205 08:39:08.411594 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerName="extract-content" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.411600 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerName="extract-content" Dec 05 08:39:08 crc kubenswrapper[4795]: E1205 08:39:08.411630 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerName="registry-server" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.411641 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerName="registry-server" Dec 05 08:39:08 crc kubenswrapper[4795]: E1205 08:39:08.411667 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerName="pull" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.411675 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerName="pull" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.411780 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" containerName="registry-server" Dec 05 08:39:08 crc 
kubenswrapper[4795]: I1205 08:39:08.411797 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c87c120-561f-4ce2-b47c-b99fb3ea4283" containerName="extract" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.412302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.414574 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-k5nrv" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.467960 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb"] Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.603274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjqxd\" (UniqueName: \"kubernetes.io/projected/d04526ec-00e4-4fef-8a4f-346bac707512-kube-api-access-wjqxd\") pod \"openstack-operator-controller-operator-6ccbc6b756-8vnkb\" (UID: \"d04526ec-00e4-4fef-8a4f-346bac707512\") " pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.704708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjqxd\" (UniqueName: \"kubernetes.io/projected/d04526ec-00e4-4fef-8a4f-346bac707512-kube-api-access-wjqxd\") pod \"openstack-operator-controller-operator-6ccbc6b756-8vnkb\" (UID: \"d04526ec-00e4-4fef-8a4f-346bac707512\") " pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.744314 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjqxd\" (UniqueName: 
\"kubernetes.io/projected/d04526ec-00e4-4fef-8a4f-346bac707512-kube-api-access-wjqxd\") pod \"openstack-operator-controller-operator-6ccbc6b756-8vnkb\" (UID: \"d04526ec-00e4-4fef-8a4f-346bac707512\") " pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" Dec 05 08:39:08 crc kubenswrapper[4795]: I1205 08:39:08.771754 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4469ae0-55f4-4cd1-9711-01aadd1377b7" path="/var/lib/kubelet/pods/b4469ae0-55f4-4cd1-9711-01aadd1377b7/volumes" Dec 05 08:39:09 crc kubenswrapper[4795]: I1205 08:39:09.032516 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-k5nrv" Dec 05 08:39:09 crc kubenswrapper[4795]: I1205 08:39:09.041322 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" Dec 05 08:39:09 crc kubenswrapper[4795]: I1205 08:39:09.403506 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb"] Dec 05 08:39:09 crc kubenswrapper[4795]: I1205 08:39:09.765205 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m8sk"] Dec 05 08:39:09 crc kubenswrapper[4795]: I1205 08:39:09.765525 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2m8sk" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerName="registry-server" containerID="cri-o://7bba1bafb156148fcd8b7c8758fb509be8bd7485951c6f325bcba1d144733dcd" gracePeriod=2 Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.226249 4795 generic.go:334] "Generic (PLEG): container finished" podID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerID="7bba1bafb156148fcd8b7c8758fb509be8bd7485951c6f325bcba1d144733dcd" exitCode=0 Dec 05 08:39:10 crc 
kubenswrapper[4795]: I1205 08:39:10.226385 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m8sk" event={"ID":"5bef636f-2d89-41e2-b924-1eb65f41183c","Type":"ContainerDied","Data":"7bba1bafb156148fcd8b7c8758fb509be8bd7485951c6f325bcba1d144733dcd"} Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.226421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m8sk" event={"ID":"5bef636f-2d89-41e2-b924-1eb65f41183c","Type":"ContainerDied","Data":"f01eaf2ec415a3c6b3a0f35137739add0632be4afa8991f9666d57d2e14273e4"} Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.226440 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f01eaf2ec415a3c6b3a0f35137739add0632be4afa8991f9666d57d2e14273e4" Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.229079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" event={"ID":"d04526ec-00e4-4fef-8a4f-346bac707512","Type":"ContainerStarted","Data":"4b7830c06e743a05b4ea9785e310a6147f240365916bef7ab10de896973a1024"} Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.252356 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.429215 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgx45\" (UniqueName: \"kubernetes.io/projected/5bef636f-2d89-41e2-b924-1eb65f41183c-kube-api-access-mgx45\") pod \"5bef636f-2d89-41e2-b924-1eb65f41183c\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.429350 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-utilities\") pod \"5bef636f-2d89-41e2-b924-1eb65f41183c\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.429414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-catalog-content\") pod \"5bef636f-2d89-41e2-b924-1eb65f41183c\" (UID: \"5bef636f-2d89-41e2-b924-1eb65f41183c\") " Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.431108 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-utilities" (OuterVolumeSpecName: "utilities") pod "5bef636f-2d89-41e2-b924-1eb65f41183c" (UID: "5bef636f-2d89-41e2-b924-1eb65f41183c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.456331 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bef636f-2d89-41e2-b924-1eb65f41183c-kube-api-access-mgx45" (OuterVolumeSpecName: "kube-api-access-mgx45") pod "5bef636f-2d89-41e2-b924-1eb65f41183c" (UID: "5bef636f-2d89-41e2-b924-1eb65f41183c"). InnerVolumeSpecName "kube-api-access-mgx45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.462912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bef636f-2d89-41e2-b924-1eb65f41183c" (UID: "5bef636f-2d89-41e2-b924-1eb65f41183c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.531527 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgx45\" (UniqueName: \"kubernetes.io/projected/5bef636f-2d89-41e2-b924-1eb65f41183c-kube-api-access-mgx45\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.531563 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:10 crc kubenswrapper[4795]: I1205 08:39:10.531573 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bef636f-2d89-41e2-b924-1eb65f41183c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:11 crc kubenswrapper[4795]: I1205 08:39:11.236456 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m8sk" Dec 05 08:39:11 crc kubenswrapper[4795]: I1205 08:39:11.263888 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m8sk"] Dec 05 08:39:11 crc kubenswrapper[4795]: I1205 08:39:11.268992 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m8sk"] Dec 05 08:39:12 crc kubenswrapper[4795]: I1205 08:39:12.758062 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" path="/var/lib/kubelet/pods/5bef636f-2d89-41e2-b924-1eb65f41183c/volumes" Dec 05 08:39:15 crc kubenswrapper[4795]: I1205 08:39:15.273287 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" event={"ID":"d04526ec-00e4-4fef-8a4f-346bac707512","Type":"ContainerStarted","Data":"2e5ed5720dc55f71e3bad84fe925672c8bf77e872dba25206f93bbc93ce324de"} Dec 05 08:39:15 crc kubenswrapper[4795]: I1205 08:39:15.274013 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" Dec 05 08:39:15 crc kubenswrapper[4795]: I1205 08:39:15.304993 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" podStartSLOduration=2.30316754 podStartE2EDuration="7.304968432s" podCreationTimestamp="2025-12-05 08:39:08 +0000 UTC" firstStartedPulling="2025-12-05 08:39:09.417507585 +0000 UTC m=+900.990111324" lastFinishedPulling="2025-12-05 08:39:14.419308477 +0000 UTC m=+905.991912216" observedRunningTime="2025-12-05 08:39:15.303734619 +0000 UTC m=+906.876338368" watchObservedRunningTime="2025-12-05 08:39:15.304968432 +0000 UTC m=+906.877572171" Dec 05 08:39:19 crc kubenswrapper[4795]: I1205 08:39:19.046318 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6ccbc6b756-8vnkb" Dec 05 08:39:33 crc kubenswrapper[4795]: I1205 08:39:33.876739 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-llggw"] Dec 05 08:39:33 crc kubenswrapper[4795]: E1205 08:39:33.877814 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerName="extract-content" Dec 05 08:39:33 crc kubenswrapper[4795]: I1205 08:39:33.877835 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerName="extract-content" Dec 05 08:39:33 crc kubenswrapper[4795]: E1205 08:39:33.877846 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerName="extract-utilities" Dec 05 08:39:33 crc kubenswrapper[4795]: I1205 08:39:33.877853 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerName="extract-utilities" Dec 05 08:39:33 crc kubenswrapper[4795]: E1205 08:39:33.877876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerName="registry-server" Dec 05 08:39:33 crc kubenswrapper[4795]: I1205 08:39:33.877883 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerName="registry-server" Dec 05 08:39:33 crc kubenswrapper[4795]: I1205 08:39:33.878016 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bef636f-2d89-41e2-b924-1eb65f41183c" containerName="registry-server" Dec 05 08:39:33 crc kubenswrapper[4795]: I1205 08:39:33.879159 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:33 crc kubenswrapper[4795]: I1205 08:39:33.940653 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llggw"] Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.001120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac916cee-7b91-43d8-9383-029f9b983c8d-catalog-content\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.001213 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt488\" (UniqueName: \"kubernetes.io/projected/ac916cee-7b91-43d8-9383-029f9b983c8d-kube-api-access-pt488\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.001439 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac916cee-7b91-43d8-9383-029f9b983c8d-utilities\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.102531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac916cee-7b91-43d8-9383-029f9b983c8d-utilities\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.102679 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac916cee-7b91-43d8-9383-029f9b983c8d-catalog-content\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.102747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt488\" (UniqueName: \"kubernetes.io/projected/ac916cee-7b91-43d8-9383-029f9b983c8d-kube-api-access-pt488\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.103423 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac916cee-7b91-43d8-9383-029f9b983c8d-utilities\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.103487 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac916cee-7b91-43d8-9383-029f9b983c8d-catalog-content\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.134248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt488\" (UniqueName: \"kubernetes.io/projected/ac916cee-7b91-43d8-9383-029f9b983c8d-kube-api-access-pt488\") pod \"certified-operators-llggw\" (UID: \"ac916cee-7b91-43d8-9383-029f9b983c8d\") " pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.203924 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:34 crc kubenswrapper[4795]: I1205 08:39:34.616195 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llggw"] Dec 05 08:39:35 crc kubenswrapper[4795]: I1205 08:39:35.625062 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac916cee-7b91-43d8-9383-029f9b983c8d" containerID="85a8acb74b0797aca460d7920c703c06da311f99ea73c59c49b0946af4061da4" exitCode=0 Dec 05 08:39:35 crc kubenswrapper[4795]: I1205 08:39:35.625125 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llggw" event={"ID":"ac916cee-7b91-43d8-9383-029f9b983c8d","Type":"ContainerDied","Data":"85a8acb74b0797aca460d7920c703c06da311f99ea73c59c49b0946af4061da4"} Dec 05 08:39:35 crc kubenswrapper[4795]: I1205 08:39:35.626807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llggw" event={"ID":"ac916cee-7b91-43d8-9383-029f9b983c8d","Type":"ContainerStarted","Data":"25e38110b5ac07e606d13813e4100c743ce688290534030678a6317e08265fcb"} Dec 05 08:39:44 crc kubenswrapper[4795]: I1205 08:39:44.771088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llggw" event={"ID":"ac916cee-7b91-43d8-9383-029f9b983c8d","Type":"ContainerStarted","Data":"d8ffa1608d5a3ec2f0969ed0b489f9a63423b8b4377357e06ae5da2a36826aae"} Dec 05 08:39:45 crc kubenswrapper[4795]: I1205 08:39:45.783763 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac916cee-7b91-43d8-9383-029f9b983c8d" containerID="d8ffa1608d5a3ec2f0969ed0b489f9a63423b8b4377357e06ae5da2a36826aae" exitCode=0 Dec 05 08:39:45 crc kubenswrapper[4795]: I1205 08:39:45.784557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llggw" 
event={"ID":"ac916cee-7b91-43d8-9383-029f9b983c8d","Type":"ContainerDied","Data":"d8ffa1608d5a3ec2f0969ed0b489f9a63423b8b4377357e06ae5da2a36826aae"} Dec 05 08:39:46 crc kubenswrapper[4795]: I1205 08:39:46.793832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llggw" event={"ID":"ac916cee-7b91-43d8-9383-029f9b983c8d","Type":"ContainerStarted","Data":"39b3a83acf2501fa82e090ef19e050e07f121b28a46ebf0756cc7b314b8542cc"} Dec 05 08:39:54 crc kubenswrapper[4795]: I1205 08:39:54.205245 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:54 crc kubenswrapper[4795]: I1205 08:39:54.205978 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:54 crc kubenswrapper[4795]: I1205 08:39:54.263604 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:54 crc kubenswrapper[4795]: I1205 08:39:54.322443 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-llggw" podStartSLOduration=10.699341822 podStartE2EDuration="21.322419783s" podCreationTimestamp="2025-12-05 08:39:33 +0000 UTC" firstStartedPulling="2025-12-05 08:39:35.628165172 +0000 UTC m=+927.200768911" lastFinishedPulling="2025-12-05 08:39:46.251243133 +0000 UTC m=+937.823846872" observedRunningTime="2025-12-05 08:39:46.838975804 +0000 UTC m=+938.411579543" watchObservedRunningTime="2025-12-05 08:39:54.322419783 +0000 UTC m=+945.895023522" Dec 05 08:39:54 crc kubenswrapper[4795]: I1205 08:39:54.893401 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-llggw" Dec 05 08:39:54 crc kubenswrapper[4795]: I1205 08:39:54.982981 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-llggw"] Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.048738 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ksjsh"] Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.049042 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ksjsh" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerName="registry-server" containerID="cri-o://469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03" gracePeriod=2 Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.531127 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.665210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm7ws\" (UniqueName: \"kubernetes.io/projected/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-kube-api-access-tm7ws\") pod \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.665308 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-catalog-content\") pod \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.665359 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-utilities\") pod \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\" (UID: \"12d2c3d5-9182-44a8-9ad9-26b54dc3135b\") " Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.666520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-utilities" (OuterVolumeSpecName: "utilities") pod "12d2c3d5-9182-44a8-9ad9-26b54dc3135b" (UID: "12d2c3d5-9182-44a8-9ad9-26b54dc3135b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.724597 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12d2c3d5-9182-44a8-9ad9-26b54dc3135b" (UID: "12d2c3d5-9182-44a8-9ad9-26b54dc3135b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.731796 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-kube-api-access-tm7ws" (OuterVolumeSpecName: "kube-api-access-tm7ws") pod "12d2c3d5-9182-44a8-9ad9-26b54dc3135b" (UID: "12d2c3d5-9182-44a8-9ad9-26b54dc3135b"). InnerVolumeSpecName "kube-api-access-tm7ws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.767805 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm7ws\" (UniqueName: \"kubernetes.io/projected/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-kube-api-access-tm7ws\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.767864 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.767882 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12d2c3d5-9182-44a8-9ad9-26b54dc3135b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.858523 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ksjsh" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.858472 4795 generic.go:334] "Generic (PLEG): container finished" podID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerID="469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03" exitCode=0 Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.858562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksjsh" event={"ID":"12d2c3d5-9182-44a8-9ad9-26b54dc3135b","Type":"ContainerDied","Data":"469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03"} Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.858723 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ksjsh" event={"ID":"12d2c3d5-9182-44a8-9ad9-26b54dc3135b","Type":"ContainerDied","Data":"72df7d1181a9ac84a8cd371fbedf6e0c588b9f4da746b3238891709348df8e45"} Dec 05 08:39:55 crc 
kubenswrapper[4795]: I1205 08:39:55.858750 4795 scope.go:117] "RemoveContainer" containerID="469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.879748 4795 scope.go:117] "RemoveContainer" containerID="c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.895696 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ksjsh"] Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.900086 4795 scope.go:117] "RemoveContainer" containerID="ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.900900 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ksjsh"] Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.928792 4795 scope.go:117] "RemoveContainer" containerID="469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03" Dec 05 08:39:55 crc kubenswrapper[4795]: E1205 08:39:55.931047 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03\": container with ID starting with 469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03 not found: ID does not exist" containerID="469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.931087 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03"} err="failed to get container status \"469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03\": rpc error: code = NotFound desc = could not find container \"469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03\": container with ID starting 
with 469beeaaf9d2a781e839e2a02a422752b3df69c25674f2c1a4d16ad8e31fcf03 not found: ID does not exist" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.931118 4795 scope.go:117] "RemoveContainer" containerID="c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351" Dec 05 08:39:55 crc kubenswrapper[4795]: E1205 08:39:55.932476 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351\": container with ID starting with c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351 not found: ID does not exist" containerID="c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.934717 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351"} err="failed to get container status \"c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351\": rpc error: code = NotFound desc = could not find container \"c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351\": container with ID starting with c4311a13eca7ff3547ac2800ea26d4590156dd3b203fb3afa0ade9fc0852c351 not found: ID does not exist" Dec 05 08:39:55 crc kubenswrapper[4795]: I1205 08:39:55.934763 4795 scope.go:117] "RemoveContainer" containerID="ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308" Dec 05 08:39:55 crc kubenswrapper[4795]: E1205 08:39:55.935301 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308\": container with ID starting with ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308 not found: ID does not exist" containerID="ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308" Dec 05 08:39:55 
crc kubenswrapper[4795]: I1205 08:39:55.935330 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308"} err="failed to get container status \"ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308\": rpc error: code = NotFound desc = could not find container \"ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308\": container with ID starting with ec438e0e15070a0117b427dcead6826fc75b8f9b5a0c64d62d927231ccbfd308 not found: ID does not exist" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.277487 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh"] Dec 05 08:39:56 crc kubenswrapper[4795]: E1205 08:39:56.277785 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerName="extract-utilities" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.277804 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerName="extract-utilities" Dec 05 08:39:56 crc kubenswrapper[4795]: E1205 08:39:56.277826 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerName="extract-content" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.277833 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerName="extract-content" Dec 05 08:39:56 crc kubenswrapper[4795]: E1205 08:39:56.277846 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerName="registry-server" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.277852 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerName="registry-server" Dec 05 08:39:56 crc 
kubenswrapper[4795]: I1205 08:39:56.277955 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" containerName="registry-server" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.278605 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.281032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fs62p" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.310986 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.312532 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.318127 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-ntmll" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.327994 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.329064 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.338876 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.342655 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xnr7f" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.372902 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.381901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xkmv\" (UniqueName: \"kubernetes.io/projected/f42b62b8-5856-4300-8bf6-b2299f1b5612-kube-api-access-6xkmv\") pod \"barbican-operator-controller-manager-7d9dfd778-fgtwh\" (UID: \"f42b62b8-5856-4300-8bf6-b2299f1b5612\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.382123 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwg92\" (UniqueName: \"kubernetes.io/projected/4d920ea1-76ae-4bb3-831f-e83ac4d57fbe-kube-api-access-bwg92\") pod \"cinder-operator-controller-manager-859b6ccc6-jcbss\" (UID: \"4d920ea1-76ae-4bb3-831f-e83ac4d57fbe\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.382203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmxl5\" (UniqueName: \"kubernetes.io/projected/67fad932-d045-4ee7-ae85-bf528a431eb3-kube-api-access-dmxl5\") pod 
\"designate-operator-controller-manager-78b4bc895b-s6td2\" (UID: \"67fad932-d045-4ee7-ae85-bf528a431eb3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.427000 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.445692 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.474380 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5xrnt" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.476691 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.491681 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.495009 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58tjg\" (UniqueName: \"kubernetes.io/projected/6fb6884d-9a5b-40bd-bc15-d51a6a645645-kube-api-access-58tjg\") pod \"glance-operator-controller-manager-77987cd8cd-kvhsl\" (UID: \"6fb6884d-9a5b-40bd-bc15-d51a6a645645\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.495083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xkmv\" (UniqueName: \"kubernetes.io/projected/f42b62b8-5856-4300-8bf6-b2299f1b5612-kube-api-access-6xkmv\") pod 
\"barbican-operator-controller-manager-7d9dfd778-fgtwh\" (UID: \"f42b62b8-5856-4300-8bf6-b2299f1b5612\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.495171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwg92\" (UniqueName: \"kubernetes.io/projected/4d920ea1-76ae-4bb3-831f-e83ac4d57fbe-kube-api-access-bwg92\") pod \"cinder-operator-controller-manager-859b6ccc6-jcbss\" (UID: \"4d920ea1-76ae-4bb3-831f-e83ac4d57fbe\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.495197 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmxl5\" (UniqueName: \"kubernetes.io/projected/67fad932-d045-4ee7-ae85-bf528a431eb3-kube-api-access-dmxl5\") pod \"designate-operator-controller-manager-78b4bc895b-s6td2\" (UID: \"67fad932-d045-4ee7-ae85-bf528a431eb3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.523689 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.524982 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.535215 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.536452 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.541097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmxl5\" (UniqueName: \"kubernetes.io/projected/67fad932-d045-4ee7-ae85-bf528a431eb3-kube-api-access-dmxl5\") pod \"designate-operator-controller-manager-78b4bc895b-s6td2\" (UID: \"67fad932-d045-4ee7-ae85-bf528a431eb3\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.551746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xkmv\" (UniqueName: \"kubernetes.io/projected/f42b62b8-5856-4300-8bf6-b2299f1b5612-kube-api-access-6xkmv\") pod \"barbican-operator-controller-manager-7d9dfd778-fgtwh\" (UID: \"f42b62b8-5856-4300-8bf6-b2299f1b5612\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.571329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwg92\" (UniqueName: \"kubernetes.io/projected/4d920ea1-76ae-4bb3-831f-e83ac4d57fbe-kube-api-access-bwg92\") pod \"cinder-operator-controller-manager-859b6ccc6-jcbss\" (UID: \"4d920ea1-76ae-4bb3-831f-e83ac4d57fbe\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.571960 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-kt76j" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.578867 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.583706 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.585307 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.590372 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-b9z7z" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.597699 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.598105 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.599599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq457\" (UniqueName: \"kubernetes.io/projected/3bb0c684-dce6-453f-b3ba-184b11da37c8-kube-api-access-hq457\") pod \"horizon-operator-controller-manager-68c6d99b8f-j6sk2\" (UID: \"3bb0c684-dce6-453f-b3ba-184b11da37c8\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.599725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58tjg\" (UniqueName: \"kubernetes.io/projected/6fb6884d-9a5b-40bd-bc15-d51a6a645645-kube-api-access-58tjg\") pod \"glance-operator-controller-manager-77987cd8cd-kvhsl\" (UID: \"6fb6884d-9a5b-40bd-bc15-d51a6a645645\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.599765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jdtxq\" (UniqueName: \"kubernetes.io/projected/e8ef9580-dae6-4db8-aa6d-5c600b8ae507-kube-api-access-jdtxq\") pod \"heat-operator-controller-manager-5f64f6f8bb-t29gj\" (UID: \"e8ef9580-dae6-4db8-aa6d-5c600b8ae507\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.622066 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.624536 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.624850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bzxct" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.634836 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.656501 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.662720 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.664322 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.669239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-k2mbn" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.689522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58tjg\" (UniqueName: \"kubernetes.io/projected/6fb6884d-9a5b-40bd-bc15-d51a6a645645-kube-api-access-58tjg\") pod \"glance-operator-controller-manager-77987cd8cd-kvhsl\" (UID: \"6fb6884d-9a5b-40bd-bc15-d51a6a645645\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.702637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfxn\" (UniqueName: \"kubernetes.io/projected/2deba92d-4689-450c-95e7-36cb8fc196c1-kube-api-access-shfxn\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.702716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq457\" (UniqueName: \"kubernetes.io/projected/3bb0c684-dce6-453f-b3ba-184b11da37c8-kube-api-access-hq457\") pod \"horizon-operator-controller-manager-68c6d99b8f-j6sk2\" (UID: \"3bb0c684-dce6-453f-b3ba-184b11da37c8\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.702753 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod 
\"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.702785 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdtxq\" (UniqueName: \"kubernetes.io/projected/e8ef9580-dae6-4db8-aa6d-5c600b8ae507-kube-api-access-jdtxq\") pod \"heat-operator-controller-manager-5f64f6f8bb-t29gj\" (UID: \"e8ef9580-dae6-4db8-aa6d-5c600b8ae507\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.705444 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.775003 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.776445 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12d2c3d5-9182-44a8-9ad9-26b54dc3135b" path="/var/lib/kubelet/pods/12d2c3d5-9182-44a8-9ad9-26b54dc3135b/volumes" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.778167 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq457\" (UniqueName: \"kubernetes.io/projected/3bb0c684-dce6-453f-b3ba-184b11da37c8-kube-api-access-hq457\") pod \"horizon-operator-controller-manager-68c6d99b8f-j6sk2\" (UID: \"3bb0c684-dce6-453f-b3ba-184b11da37c8\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.787351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdtxq\" (UniqueName: 
\"kubernetes.io/projected/e8ef9580-dae6-4db8-aa6d-5c600b8ae507-kube-api-access-jdtxq\") pod \"heat-operator-controller-manager-5f64f6f8bb-t29gj\" (UID: \"e8ef9580-dae6-4db8-aa6d-5c600b8ae507\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.810779 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.810900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfxn\" (UniqueName: \"kubernetes.io/projected/2deba92d-4689-450c-95e7-36cb8fc196c1-kube-api-access-shfxn\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.810968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g772v\" (UniqueName: \"kubernetes.io/projected/61b3d615-1654-4fc5-a601-43f68103ac52-kube-api-access-g772v\") pod \"ironic-operator-controller-manager-6c548fd776-b9vm2\" (UID: \"61b3d615-1654-4fc5-a601-43f68103ac52\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" Dec 05 08:39:56 crc kubenswrapper[4795]: E1205 08:39:56.811215 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 08:39:56 crc kubenswrapper[4795]: E1205 08:39:56.811280 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert podName:2deba92d-4689-450c-95e7-36cb8fc196c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:39:57.311255549 +0000 UTC m=+948.883859288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert") pod "infra-operator-controller-manager-57548d458d-fb4c2" (UID: "2deba92d-4689-450c-95e7-36cb8fc196c1") : secret "infra-operator-webhook-server-cert" not found Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.823248 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.824572 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.836533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-sg6hg" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.860692 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.866742 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.868059 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.874223 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-qjm6j" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.888297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfxn\" (UniqueName: \"kubernetes.io/projected/2deba92d-4689-450c-95e7-36cb8fc196c1-kube-api-access-shfxn\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.906713 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l"] Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.928779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvblx\" (UniqueName: \"kubernetes.io/projected/8db4219d-3a4b-4470-9c6d-db1b98c9b3dc-kube-api-access-kvblx\") pod \"keystone-operator-controller-manager-7765d96ddf-8dh85\" (UID: \"8db4219d-3a4b-4470-9c6d-db1b98c9b3dc\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.928917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g772v\" (UniqueName: \"kubernetes.io/projected/61b3d615-1654-4fc5-a601-43f68103ac52-kube-api-access-g772v\") pod \"ironic-operator-controller-manager-6c548fd776-b9vm2\" (UID: \"61b3d615-1654-4fc5-a601-43f68103ac52\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.928944 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtnkf\" (UniqueName: \"kubernetes.io/projected/63be1623-e1cd-4904-99cb-9497a6596599-kube-api-access-dtnkf\") pod \"manila-operator-controller-manager-7c79b5df47-tkk5l\" (UID: \"63be1623-e1cd-4904-99cb-9497a6596599\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" Dec 05 08:39:56 crc kubenswrapper[4795]: I1205 08:39:56.978003 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.003141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g772v\" (UniqueName: \"kubernetes.io/projected/61b3d615-1654-4fc5-a601-43f68103ac52-kube-api-access-g772v\") pod \"ironic-operator-controller-manager-6c548fd776-b9vm2\" (UID: \"61b3d615-1654-4fc5-a601-43f68103ac52\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.003380 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.030483 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtnkf\" (UniqueName: \"kubernetes.io/projected/63be1623-e1cd-4904-99cb-9497a6596599-kube-api-access-dtnkf\") pod \"manila-operator-controller-manager-7c79b5df47-tkk5l\" (UID: \"63be1623-e1cd-4904-99cb-9497a6596599\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.030996 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvblx\" (UniqueName: \"kubernetes.io/projected/8db4219d-3a4b-4470-9c6d-db1b98c9b3dc-kube-api-access-kvblx\") pod \"keystone-operator-controller-manager-7765d96ddf-8dh85\" (UID: \"8db4219d-3a4b-4470-9c6d-db1b98c9b3dc\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.042732 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.044009 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.052333 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.053010 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-m6p2n" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.078236 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.094799 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.096133 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.114879 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.116236 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.119431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtnkf\" (UniqueName: \"kubernetes.io/projected/63be1623-e1cd-4904-99cb-9497a6596599-kube-api-access-dtnkf\") pod \"manila-operator-controller-manager-7c79b5df47-tkk5l\" (UID: \"63be1623-e1cd-4904-99cb-9497a6596599\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.132184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvblx\" (UniqueName: \"kubernetes.io/projected/8db4219d-3a4b-4470-9c6d-db1b98c9b3dc-kube-api-access-kvblx\") pod \"keystone-operator-controller-manager-7765d96ddf-8dh85\" (UID: \"8db4219d-3a4b-4470-9c6d-db1b98c9b3dc\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.132573 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ppm6\" (UniqueName: \"kubernetes.io/projected/df2401aa-47d5-4301-93ea-41a8c8b32cc9-kube-api-access-7ppm6\") pod \"mariadb-operator-controller-manager-56bbcc9d85-fxncz\" (UID: \"df2401aa-47d5-4301-93ea-41a8c8b32cc9\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.135878 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.136006 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dmvkg" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.164037 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-blpbz" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.164256 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-n89xx"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.165380 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.175798 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.187254 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8zfmd" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.206678 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.234852 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7j46\" (UniqueName: \"kubernetes.io/projected/b41e588d-948f-4709-8717-cfbe8fbba4c9-kube-api-access-c7j46\") pod \"nova-operator-controller-manager-697bc559fc-mbkw8\" (UID: \"b41e588d-948f-4709-8717-cfbe8fbba4c9\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.235360 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9b98\" (UniqueName: \"kubernetes.io/projected/33394c60-0058-4c0a-8582-cdd95c25bd19-kube-api-access-s9b98\") pod \"octavia-operator-controller-manager-998648c74-n89xx\" (UID: 
\"33394c60-0058-4c0a-8582-cdd95c25bd19\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.235420 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-n89xx"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.235462 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ppm6\" (UniqueName: \"kubernetes.io/projected/df2401aa-47d5-4301-93ea-41a8c8b32cc9-kube-api-access-7ppm6\") pod \"mariadb-operator-controller-manager-56bbcc9d85-fxncz\" (UID: \"df2401aa-47d5-4301-93ea-41a8c8b32cc9\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.235498 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcwbb\" (UniqueName: \"kubernetes.io/projected/e8acf865-8373-4a37-ba22-bc276e596f2d-kube-api-access-dcwbb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-m6snn\" (UID: \"e8acf865-8373-4a37-ba22-bc276e596f2d\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.272011 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.347934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcwbb\" (UniqueName: \"kubernetes.io/projected/e8acf865-8373-4a37-ba22-bc276e596f2d-kube-api-access-dcwbb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-m6snn\" (UID: \"e8acf865-8373-4a37-ba22-bc276e596f2d\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.348033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7j46\" (UniqueName: \"kubernetes.io/projected/b41e588d-948f-4709-8717-cfbe8fbba4c9-kube-api-access-c7j46\") pod \"nova-operator-controller-manager-697bc559fc-mbkw8\" (UID: \"b41e588d-948f-4709-8717-cfbe8fbba4c9\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.348081 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9b98\" (UniqueName: \"kubernetes.io/projected/33394c60-0058-4c0a-8582-cdd95c25bd19-kube-api-access-s9b98\") pod \"octavia-operator-controller-manager-998648c74-n89xx\" (UID: \"33394c60-0058-4c0a-8582-cdd95c25bd19\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.348125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:39:57 crc kubenswrapper[4795]: E1205 08:39:57.348494 4795 secret.go:188] Couldn't get 
secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 08:39:57 crc kubenswrapper[4795]: E1205 08:39:57.348560 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert podName:2deba92d-4689-450c-95e7-36cb8fc196c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:39:58.348538816 +0000 UTC m=+949.921142555 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert") pod "infra-operator-controller-manager-57548d458d-fb4c2" (UID: "2deba92d-4689-450c-95e7-36cb8fc196c1") : secret "infra-operator-webhook-server-cert" not found Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.421657 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.448253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcwbb\" (UniqueName: \"kubernetes.io/projected/e8acf865-8373-4a37-ba22-bc276e596f2d-kube-api-access-dcwbb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-m6snn\" (UID: \"e8acf865-8373-4a37-ba22-bc276e596f2d\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.477725 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.488320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ppm6\" (UniqueName: \"kubernetes.io/projected/df2401aa-47d5-4301-93ea-41a8c8b32cc9-kube-api-access-7ppm6\") pod \"mariadb-operator-controller-manager-56bbcc9d85-fxncz\" (UID: \"df2401aa-47d5-4301-93ea-41a8c8b32cc9\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.494513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9b98\" (UniqueName: \"kubernetes.io/projected/33394c60-0058-4c0a-8582-cdd95c25bd19-kube-api-access-s9b98\") pod \"octavia-operator-controller-manager-998648c74-n89xx\" (UID: \"33394c60-0058-4c0a-8582-cdd95c25bd19\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.497453 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-n8pq2" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.497494 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.530281 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.536144 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7j46\" (UniqueName: \"kubernetes.io/projected/b41e588d-948f-4709-8717-cfbe8fbba4c9-kube-api-access-c7j46\") pod \"nova-operator-controller-manager-697bc559fc-mbkw8\" (UID: 
\"b41e588d-948f-4709-8717-cfbe8fbba4c9\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.545487 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.547336 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.628521 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.629827 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.633524 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.673683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw6mc\" (UniqueName: \"kubernetes.io/projected/60a49846-77cd-440b-b8b2-988cd340dd18-kube-api-access-xw6mc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.673751 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmgxw\" (UniqueName: \"kubernetes.io/projected/4759b941-a4a1-470a-99e1-9acc898804e9-kube-api-access-zmgxw\") pod \"ovn-operator-controller-manager-b6456fdb6-mjbz5\" (UID: \"4759b941-a4a1-470a-99e1-9acc898804e9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.673826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.691786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-f4qzj" Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.696523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5"] Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.702119 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz"
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.765870 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn"]
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.779743 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn"
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.792478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw6mc\" (UniqueName: \"kubernetes.io/projected/60a49846-77cd-440b-b8b2-988cd340dd18-kube-api-access-xw6mc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24"
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.792770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmgxw\" (UniqueName: \"kubernetes.io/projected/4759b941-a4a1-470a-99e1-9acc898804e9-kube-api-access-zmgxw\") pod \"ovn-operator-controller-manager-b6456fdb6-mjbz5\" (UID: \"4759b941-a4a1-470a-99e1-9acc898804e9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5"
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.792817 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24"
Dec 05 08:39:57 crc kubenswrapper[4795]: E1205 08:39:57.792959 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 08:39:57 crc kubenswrapper[4795]: E1205 08:39:57.793021 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert podName:60a49846-77cd-440b-b8b2-988cd340dd18 nodeName:}" failed. No retries permitted until 2025-12-05 08:39:58.293000786 +0000 UTC m=+949.865604525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" (UID: "60a49846-77cd-440b-b8b2-988cd340dd18") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.795301 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-cdc8s"
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.829979 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn"]
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.905635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cfhx\" (UniqueName: \"kubernetes.io/projected/a0f6ca90-a15b-4fd1-b934-c4428e4c0d90-kube-api-access-6cfhx\") pod \"placement-operator-controller-manager-78f8948974-mjmbn\" (UID: \"a0f6ca90-a15b-4fd1-b934-c4428e4c0d90\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn"
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.933631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmgxw\" (UniqueName: \"kubernetes.io/projected/4759b941-a4a1-470a-99e1-9acc898804e9-kube-api-access-zmgxw\") pod \"ovn-operator-controller-manager-b6456fdb6-mjbz5\" (UID: \"4759b941-a4a1-470a-99e1-9acc898804e9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5"
Dec 05 08:39:57 crc kubenswrapper[4795]: I1205 08:39:57.936426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw6mc\" (UniqueName: \"kubernetes.io/projected/60a49846-77cd-440b-b8b2-988cd340dd18-kube-api-access-xw6mc\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.007355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cfhx\" (UniqueName: \"kubernetes.io/projected/a0f6ca90-a15b-4fd1-b934-c4428e4c0d90-kube-api-access-6cfhx\") pod \"placement-operator-controller-manager-78f8948974-mjmbn\" (UID: \"a0f6ca90-a15b-4fd1-b934-c4428e4c0d90\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.047920 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.054287 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cfhx\" (UniqueName: \"kubernetes.io/projected/a0f6ca90-a15b-4fd1-b934-c4428e4c0d90-kube-api-access-6cfhx\") pod \"placement-operator-controller-manager-78f8948974-mjmbn\" (UID: \"a0f6ca90-a15b-4fd1-b934-c4428e4c0d90\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.105295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" event={"ID":"f42b62b8-5856-4300-8bf6-b2299f1b5612","Type":"ContainerStarted","Data":"35add1734ba99e0c926e00bafc702267ed5c5beae50da49cc54111cd96e7a26d"}
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.113011 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.131830 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.133102 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.138053 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5frk7"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.148712 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.150156 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.154979 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lg92k"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.163271 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.164349 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.173847 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-l7slt"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.191510 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.213969 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.221757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg5nf\" (UniqueName: \"kubernetes.io/projected/6c624ff1-59b4-4d7a-af4c-0dd48235842b-kube-api-access-kg5nf\") pod \"swift-operator-controller-manager-5f8c65bbfc-stfxh\" (UID: \"6c624ff1-59b4-4d7a-af4c-0dd48235842b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.221833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6n2q\" (UniqueName: \"kubernetes.io/projected/75cb47f2-6153-4f9b-9634-151793360092-kube-api-access-p6n2q\") pod \"telemetry-operator-controller-manager-76cc84c6bb-vbzr4\" (UID: \"75cb47f2-6153-4f9b-9634-151793360092\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.221924 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmjpq\" (UniqueName: \"kubernetes.io/projected/4d769f5c-b4d9-4049-9cf8-73d02b343b1f-kube-api-access-zmjpq\") pod \"test-operator-controller-manager-5854674fcc-xbwwz\" (UID: \"4d769f5c-b4d9-4049-9cf8-73d02b343b1f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.222283 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.233990 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.244660 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.245903 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.249192 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-szvn2"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.266257 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.267401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.273687 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.283144 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.283323 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.283478 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kbh7x"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.332837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rdg9\" (UniqueName: \"kubernetes.io/projected/b993e6ee-cadd-4671-99dd-bb54433c0064-kube-api-access-9rdg9\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.332900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmjpq\" (UniqueName: \"kubernetes.io/projected/4d769f5c-b4d9-4049-9cf8-73d02b343b1f-kube-api-access-zmjpq\") pod \"test-operator-controller-manager-5854674fcc-xbwwz\" (UID: \"4d769f5c-b4d9-4049-9cf8-73d02b343b1f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.332965 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.332995 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.333023 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.333058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6z2p\" (UniqueName: \"kubernetes.io/projected/201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3-kube-api-access-q6z2p\") pod \"watcher-operator-controller-manager-769dc69bc-pzc2j\" (UID: \"201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.333104 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg5nf\" (UniqueName: \"kubernetes.io/projected/6c624ff1-59b4-4d7a-af4c-0dd48235842b-kube-api-access-kg5nf\") pod \"swift-operator-controller-manager-5f8c65bbfc-stfxh\" (UID: \"6c624ff1-59b4-4d7a-af4c-0dd48235842b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.333133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6n2q\" (UniqueName: \"kubernetes.io/projected/75cb47f2-6153-4f9b-9634-151793360092-kube-api-access-p6n2q\") pod \"telemetry-operator-controller-manager-76cc84c6bb-vbzr4\" (UID: \"75cb47f2-6153-4f9b-9634-151793360092\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4"
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.333451 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.333498 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert podName:60a49846-77cd-440b-b8b2-988cd340dd18 nodeName:}" failed. No retries permitted until 2025-12-05 08:39:59.33348118 +0000 UTC m=+950.906084919 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" (UID: "60a49846-77cd-440b-b8b2-988cd340dd18") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.434197 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rdg9\" (UniqueName: \"kubernetes.io/projected/b993e6ee-cadd-4671-99dd-bb54433c0064-kube-api-access-9rdg9\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.435110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.435474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.435680 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6z2p\" (UniqueName: \"kubernetes.io/projected/201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3-kube-api-access-q6z2p\") pod \"watcher-operator-controller-manager-769dc69bc-pzc2j\" (UID: \"201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.435896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.434454 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg5nf\" (UniqueName: \"kubernetes.io/projected/6c624ff1-59b4-4d7a-af4c-0dd48235842b-kube-api-access-kg5nf\") pod \"swift-operator-controller-manager-5f8c65bbfc-stfxh\" (UID: \"6c624ff1-59b4-4d7a-af4c-0dd48235842b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh"
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.435368 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.439843 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:39:58.939815145 +0000 UTC m=+950.512418884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "metrics-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.440049 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6n2q\" (UniqueName: \"kubernetes.io/projected/75cb47f2-6153-4f9b-9634-151793360092-kube-api-access-p6n2q\") pod \"telemetry-operator-controller-manager-76cc84c6bb-vbzr4\" (UID: \"75cb47f2-6153-4f9b-9634-151793360092\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4"
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.436131 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.440199 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert podName:2deba92d-4689-450c-95e7-36cb8fc196c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:00.440174574 +0000 UTC m=+952.012778313 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert") pod "infra-operator-controller-manager-57548d458d-fb4c2" (UID: "2deba92d-4689-450c-95e7-36cb8fc196c1") : secret "infra-operator-webhook-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.439581 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.440553 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:39:58.940506643 +0000 UTC m=+950.513110382 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "webhook-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.447784 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmjpq\" (UniqueName: \"kubernetes.io/projected/4d769f5c-b4d9-4049-9cf8-73d02b343b1f-kube-api-access-zmjpq\") pod \"test-operator-controller-manager-5854674fcc-xbwwz\" (UID: \"4d769f5c-b4d9-4049-9cf8-73d02b343b1f\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.476145 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.501377 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6z2p\" (UniqueName: \"kubernetes.io/projected/201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3-kube-api-access-q6z2p\") pod \"watcher-operator-controller-manager-769dc69bc-pzc2j\" (UID: \"201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.523769 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.526301 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rdg9\" (UniqueName: \"kubernetes.io/projected/b993e6ee-cadd-4671-99dd-bb54433c0064-kube-api-access-9rdg9\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.558694 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.559928 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.562160 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.568031 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-bn6sx"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.585564 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.605024 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.653232 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxxzn\" (UniqueName: \"kubernetes.io/projected/d83675df-1935-473f-925e-5b40d61aadfa-kube-api-access-mxxzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g6ghd\" (UID: \"d83675df-1935-473f-925e-5b40d61aadfa\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.710884 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.748842 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.778870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxxzn\" (UniqueName: \"kubernetes.io/projected/d83675df-1935-473f-925e-5b40d61aadfa-kube-api-access-mxxzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g6ghd\" (UID: \"d83675df-1935-473f-925e-5b40d61aadfa\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.840022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxxzn\" (UniqueName: \"kubernetes.io/projected/d83675df-1935-473f-925e-5b40d61aadfa-kube-api-access-mxxzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g6ghd\" (UID: \"d83675df-1935-473f-925e-5b40d61aadfa\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.880895 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.895146 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl"]
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.991739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: I1205 08:39:58.991848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.991962 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.992065 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:39:59.992038998 +0000 UTC m=+951.564642727 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "metrics-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.992081 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 05 08:39:58 crc kubenswrapper[4795]: E1205 08:39:58.992150 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:39:59.992130321 +0000 UTC m=+951.564734060 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "webhook-server-cert" not found
Dec 05 08:39:59 crc kubenswrapper[4795]: I1205 08:39:59.031548 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd"
Dec 05 08:39:59 crc kubenswrapper[4795]: I1205 08:39:59.197925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" event={"ID":"67fad932-d045-4ee7-ae85-bf528a431eb3","Type":"ContainerStarted","Data":"62ac13950c28c16b72822424dfd492e8dfc050dc3ea1989c39531c8494344e2c"}
Dec 05 08:39:59 crc kubenswrapper[4795]: I1205 08:39:59.257969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" event={"ID":"4d920ea1-76ae-4bb3-831f-e83ac4d57fbe","Type":"ContainerStarted","Data":"2be6dde5e3ed0832cdb1621c47504ca6a33f297cf5a75d3f6d9887c6b8e6bde9"}
Dec 05 08:39:59 crc kubenswrapper[4795]: I1205 08:39:59.407636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24"
Dec 05 08:39:59 crc kubenswrapper[4795]: E1205 08:39:59.407875 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 08:39:59 crc kubenswrapper[4795]: E1205 08:39:59.407984 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert podName:60a49846-77cd-440b-b8b2-988cd340dd18 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:01.407960631 +0000 UTC m=+952.980564360 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" (UID: "60a49846-77cd-440b-b8b2-988cd340dd18") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 05 08:39:59 crc kubenswrapper[4795]: I1205 08:39:59.740803 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2"]
Dec 05 08:39:59 crc kubenswrapper[4795]: I1205 08:39:59.759427 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2"]
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.035981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.036067 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"
Dec 05 08:40:00 crc kubenswrapper[4795]: E1205 08:40:00.036283 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 05 08:40:00 crc kubenswrapper[4795]: E1205 08:40:00.036367 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:02.036346219 +0000 UTC m=+953.608949958 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "webhook-server-cert" not found
Dec 05 08:40:00 crc kubenswrapper[4795]: E1205 08:40:00.036870 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 05 08:40:00 crc kubenswrapper[4795]: E1205 08:40:00.036902 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:02.036893573 +0000 UTC m=+953.609497312 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "metrics-server-cert" not found
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.132640 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-n89xx"]
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.201079 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l"]
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.298591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" event={"ID":"61b3d615-1654-4fc5-a601-43f68103ac52","Type":"ContainerStarted","Data":"04f62eb760082beaccf5f21c75801ffd482526b430f7319cf6b4e3757d018c96"}
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.299280 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn"]
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.329256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" event={"ID":"63be1623-e1cd-4904-99cb-9497a6596599","Type":"ContainerStarted","Data":"d2925653191bdc085d98656820a6e8483b538d4111be53c78bdcf8d099f70a94"}
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.337803 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj"]
Dec 05 08:40:00 crc kubenswrapper[4795]: W1205 08:40:00.341842 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8acf865_8373_4a37_ba22_bc276e596f2d.slice/crio-91410500b04f7dab9f474c8ff4288c48cd0a0f32f1e08280f3779d01f9aad0db WatchSource:0}: Error finding container 91410500b04f7dab9f474c8ff4288c48cd0a0f32f1e08280f3779d01f9aad0db: Status 404 returned error can't find the container with id 91410500b04f7dab9f474c8ff4288c48cd0a0f32f1e08280f3779d01f9aad0db
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.343114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" event={"ID":"3bb0c684-dce6-453f-b3ba-184b11da37c8","Type":"ContainerStarted","Data":"be1d0de39033fc02dee7f79bd970a00832a3e6a45dfe998855f2dfa22acf50c7"}
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.344106 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" event={"ID":"33394c60-0058-4c0a-8582-cdd95c25bd19","Type":"ContainerStarted","Data":"4529a33420771d4a52679dd85a5301943ac47cb23ccee4e15bb5b2d020d64c3a"}
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.363992 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" event={"ID":"6fb6884d-9a5b-40bd-bc15-d51a6a645645","Type":"ContainerStarted","Data":"c11d03d602b4ce0e9562df9327bbc5356f2a4232bf6467d5bd9bd083110ceb93"}
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.372377 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5"]
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.409093 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn"]
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.451116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2"
Dec 05 08:40:00 crc kubenswrapper[4795]: E1205 08:40:00.451386 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 05 08:40:00 crc kubenswrapper[4795]: E1205 08:40:00.451456 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert podName:2deba92d-4689-450c-95e7-36cb8fc196c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:04.451436929 +0000 UTC m=+956.024040668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert") pod "infra-operator-controller-manager-57548d458d-fb4c2" (UID: "2deba92d-4689-450c-95e7-36cb8fc196c1") : secret "infra-operator-webhook-server-cert" not found
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.497131 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85"]
Dec 05 08:40:00 crc kubenswrapper[4795]: W1205 08:40:00.519663 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb41e588d_948f_4709_8717_cfbe8fbba4c9.slice/crio-c714249624fbf2b6664ab6ad52dcdd9b94047ea8cadddfe7a11029845e80c990 WatchSource:0}: Error finding container c714249624fbf2b6664ab6ad52dcdd9b94047ea8cadddfe7a11029845e80c990: Status 404 returned error can't find the container with id c714249624fbf2b6664ab6ad52dcdd9b94047ea8cadddfe7a11029845e80c990
Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.529270 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8"] Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.549628 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh"] Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.618533 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz"] Dec 05 08:40:00 crc kubenswrapper[4795]: W1205 08:40:00.620842 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c624ff1_59b4_4d7a_af4c_0dd48235842b.slice/crio-f2cc639e9c25115bd66f35a650ecec9939f66dfeb20a6e85592c14a22e7dd088 WatchSource:0}: Error finding container f2cc639e9c25115bd66f35a650ecec9939f66dfeb20a6e85592c14a22e7dd088: Status 404 returned error can't find the container with id f2cc639e9c25115bd66f35a650ecec9939f66dfeb20a6e85592c14a22e7dd088 Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.634365 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4"] Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.644588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd"] Dec 05 08:40:00 crc kubenswrapper[4795]: W1205 08:40:00.686088 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd83675df_1935_473f_925e_5b40d61aadfa.slice/crio-9ea848b9e7bb430953b93bb2ce5708d5c77f63b3b7a2b8fecd0d4b014d14bc74 WatchSource:0}: Error finding container 9ea848b9e7bb430953b93bb2ce5708d5c77f63b3b7a2b8fecd0d4b014d14bc74: Status 404 returned error can't find the container with id 9ea848b9e7bb430953b93bb2ce5708d5c77f63b3b7a2b8fecd0d4b014d14bc74 Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 
08:40:00.778391 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j"] Dec 05 08:40:00 crc kubenswrapper[4795]: I1205 08:40:00.814558 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz"] Dec 05 08:40:00 crc kubenswrapper[4795]: W1205 08:40:00.817376 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod201a6dcf_235e_4ac0_b42f_9dfa86c2ffe3.slice/crio-9650471031b13f4335c1e8f9310354aa26f6f06f52fbc59a675b45db4c2e79ea WatchSource:0}: Error finding container 9650471031b13f4335c1e8f9310354aa26f6f06f52fbc59a675b45db4c2e79ea: Status 404 returned error can't find the container with id 9650471031b13f4335c1e8f9310354aa26f6f06f52fbc59a675b45db4c2e79ea Dec 05 08:40:00 crc kubenswrapper[4795]: E1205 08:40:00.992806 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q6z2p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-pzc2j_openstack-operators(201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 08:40:01 crc kubenswrapper[4795]: E1205 08:40:01.005170 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q6z2p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-pzc2j_openstack-operators(201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 05 08:40:01 crc kubenswrapper[4795]: E1205 08:40:01.006451 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" podUID="201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3" Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.502738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" event={"ID":"b41e588d-948f-4709-8717-cfbe8fbba4c9","Type":"ContainerStarted","Data":"c714249624fbf2b6664ab6ad52dcdd9b94047ea8cadddfe7a11029845e80c990"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.507718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" event={"ID":"e8acf865-8373-4a37-ba22-bc276e596f2d","Type":"ContainerStarted","Data":"91410500b04f7dab9f474c8ff4288c48cd0a0f32f1e08280f3779d01f9aad0db"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.508454 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:40:01 crc kubenswrapper[4795]: E1205 08:40:01.508680 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 08:40:01 crc kubenswrapper[4795]: E1205 08:40:01.508737 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert podName:60a49846-77cd-440b-b8b2-988cd340dd18 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:05.508719942 +0000 UTC m=+957.081323681 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" (UID: "60a49846-77cd-440b-b8b2-988cd340dd18") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.516741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" event={"ID":"8db4219d-3a4b-4470-9c6d-db1b98c9b3dc","Type":"ContainerStarted","Data":"d70e2815b55d51b12d95cfc000f6c976b828ca4e72190918d971d71166587fde"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.524707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd" event={"ID":"d83675df-1935-473f-925e-5b40d61aadfa","Type":"ContainerStarted","Data":"9ea848b9e7bb430953b93bb2ce5708d5c77f63b3b7a2b8fecd0d4b014d14bc74"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.534840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" event={"ID":"201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3","Type":"ContainerStarted","Data":"9650471031b13f4335c1e8f9310354aa26f6f06f52fbc59a675b45db4c2e79ea"} Dec 05 08:40:01 crc kubenswrapper[4795]: E1205 08:40:01.550924 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" 
podUID="201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3" Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.562050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn" event={"ID":"a0f6ca90-a15b-4fd1-b934-c4428e4c0d90","Type":"ContainerStarted","Data":"ebec88dcf997e822ecd917a05f711b67fa2dc454bfda983dbe88f632c25cc6b3"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.583317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" event={"ID":"e8ef9580-dae6-4db8-aa6d-5c600b8ae507","Type":"ContainerStarted","Data":"48064f6c518375e4f15217839be7e32f41e08475a02df20c2b235553bd6e7081"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.586817 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" event={"ID":"4759b941-a4a1-470a-99e1-9acc898804e9","Type":"ContainerStarted","Data":"5e3cce31022415ed8cf998a9692c95bd2ddc1616b61a488a877bf7d47aed6a16"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.625746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh" event={"ID":"6c624ff1-59b4-4d7a-af4c-0dd48235842b","Type":"ContainerStarted","Data":"f2cc639e9c25115bd66f35a650ecec9939f66dfeb20a6e85592c14a22e7dd088"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.663927 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" event={"ID":"df2401aa-47d5-4301-93ea-41a8c8b32cc9","Type":"ContainerStarted","Data":"51952b00e1b36bef7856cee806dbbe51b914b2e83b36fce2e602b4dac7297617"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.689626 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz" 
event={"ID":"4d769f5c-b4d9-4049-9cf8-73d02b343b1f","Type":"ContainerStarted","Data":"3a3f8be1bc01931b9e3aebf2cd09b7bc9fb6ae50c8048c539cdc7ed5918a1989"} Dec 05 08:40:01 crc kubenswrapper[4795]: I1205 08:40:01.720854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4" event={"ID":"75cb47f2-6153-4f9b-9634-151793360092","Type":"ContainerStarted","Data":"4287b9bc4571e4d634a487672f9888fef99971c121e9cd45b42677e05ede056b"} Dec 05 08:40:02 crc kubenswrapper[4795]: I1205 08:40:02.122943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:02 crc kubenswrapper[4795]: I1205 08:40:02.123006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:02 crc kubenswrapper[4795]: E1205 08:40:02.123157 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 08:40:02 crc kubenswrapper[4795]: E1205 08:40:02.123222 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:06.123202351 +0000 UTC m=+957.695806090 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "webhook-server-cert" not found Dec 05 08:40:02 crc kubenswrapper[4795]: E1205 08:40:02.123833 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 08:40:02 crc kubenswrapper[4795]: E1205 08:40:02.123939 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:06.1239154 +0000 UTC m=+957.696519139 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "metrics-server-cert" not found Dec 05 08:40:02 crc kubenswrapper[4795]: E1205 08:40:02.739187 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" podUID="201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3" Dec 05 08:40:04 crc kubenswrapper[4795]: I1205 08:40:04.491538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:40:04 crc kubenswrapper[4795]: E1205 08:40:04.492245 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 08:40:04 crc kubenswrapper[4795]: E1205 08:40:04.492429 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert podName:2deba92d-4689-450c-95e7-36cb8fc196c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:12.49240883 +0000 UTC m=+964.065012559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert") pod "infra-operator-controller-manager-57548d458d-fb4c2" (UID: "2deba92d-4689-450c-95e7-36cb8fc196c1") : secret "infra-operator-webhook-server-cert" not found Dec 05 08:40:05 crc kubenswrapper[4795]: I1205 08:40:05.522206 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:40:05 crc kubenswrapper[4795]: E1205 08:40:05.522477 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 08:40:05 crc kubenswrapper[4795]: E1205 08:40:05.522628 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert 
podName:60a49846-77cd-440b-b8b2-988cd340dd18 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:13.522583895 +0000 UTC m=+965.095187634 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" (UID: "60a49846-77cd-440b-b8b2-988cd340dd18") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 08:40:06 crc kubenswrapper[4795]: I1205 08:40:06.137214 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:06 crc kubenswrapper[4795]: I1205 08:40:06.137693 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:06 crc kubenswrapper[4795]: E1205 08:40:06.137483 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 05 08:40:06 crc kubenswrapper[4795]: E1205 08:40:06.137845 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:14.137813955 +0000 UTC m=+965.710417694 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "metrics-server-cert" not found Dec 05 08:40:06 crc kubenswrapper[4795]: E1205 08:40:06.137901 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 08:40:06 crc kubenswrapper[4795]: E1205 08:40:06.137967 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:14.137947738 +0000 UTC m=+965.710551477 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "webhook-server-cert" not found Dec 05 08:40:12 crc kubenswrapper[4795]: I1205 08:40:12.551038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:40:12 crc kubenswrapper[4795]: E1205 08:40:12.551384 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 05 08:40:12 crc kubenswrapper[4795]: E1205 08:40:12.552273 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert 
podName:2deba92d-4689-450c-95e7-36cb8fc196c1 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:28.552236481 +0000 UTC m=+980.124840400 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert") pod "infra-operator-controller-manager-57548d458d-fb4c2" (UID: "2deba92d-4689-450c-95e7-36cb8fc196c1") : secret "infra-operator-webhook-server-cert" not found Dec 05 08:40:13 crc kubenswrapper[4795]: I1205 08:40:13.570361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:40:13 crc kubenswrapper[4795]: E1205 08:40:13.570579 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 08:40:13 crc kubenswrapper[4795]: E1205 08:40:13.570891 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert podName:60a49846-77cd-440b-b8b2-988cd340dd18 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:29.570868291 +0000 UTC m=+981.143472030 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" (UID: "60a49846-77cd-440b-b8b2-988cd340dd18") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 05 08:40:14 crc kubenswrapper[4795]: I1205 08:40:14.184002 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:14 crc kubenswrapper[4795]: I1205 08:40:14.184078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:14 crc kubenswrapper[4795]: E1205 08:40:14.184226 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 05 08:40:14 crc kubenswrapper[4795]: E1205 08:40:14.184307 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs podName:b993e6ee-cadd-4671-99dd-bb54433c0064 nodeName:}" failed. No retries permitted until 2025-12-05 08:40:30.184285691 +0000 UTC m=+981.756889430 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs") pod "openstack-operator-controller-manager-8c7b64495-p2lwl" (UID: "b993e6ee-cadd-4671-99dd-bb54433c0064") : secret "webhook-server-cert" not found Dec 05 08:40:14 crc kubenswrapper[4795]: I1205 08:40:14.191255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-metrics-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:17 crc kubenswrapper[4795]: E1205 08:40:17.393465 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 05 08:40:17 crc kubenswrapper[4795]: E1205 08:40:17.394122 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bwg92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-jcbss_openstack-operators(4d920ea1-76ae-4bb3-831f-e83ac4d57fbe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:18 crc kubenswrapper[4795]: E1205 08:40:18.321829 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 05 08:40:18 crc kubenswrapper[4795]: E1205 08:40:18.322584 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hq457,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-j6sk2_openstack-operators(3bb0c684-dce6-453f-b3ba-184b11da37c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:23 crc kubenswrapper[4795]: E1205 08:40:23.573649 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 05 08:40:23 crc kubenswrapper[4795]: E1205 08:40:23.575067 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kg5nf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-stfxh_openstack-operators(6c624ff1-59b4-4d7a-af4c-0dd48235842b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:25 crc kubenswrapper[4795]: E1205 08:40:25.078232 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 05 08:40:25 crc kubenswrapper[4795]: E1205 08:40:25.079170 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dmxl5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-s6td2_openstack-operators(67fad932-d045-4ee7-ae85-bf528a431eb3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:26 crc kubenswrapper[4795]: E1205 08:40:26.981931 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 05 08:40:26 crc kubenswrapper[4795]: E1205 08:40:26.982203 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zmgxw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-mjbz5_openstack-operators(4759b941-a4a1-470a-99e1-9acc898804e9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:27 crc kubenswrapper[4795]: E1205 08:40:27.663948 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 05 08:40:27 crc kubenswrapper[4795]: E1205 08:40:27.664716 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zmjpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-xbwwz_openstack-operators(4d769f5c-b4d9-4049-9cf8-73d02b343b1f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:28 crc kubenswrapper[4795]: E1205 08:40:28.222232 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 05 08:40:28 crc kubenswrapper[4795]: E1205 08:40:28.222481 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dcwbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-m6snn_openstack-operators(e8acf865-8373-4a37-ba22-bc276e596f2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:28 crc kubenswrapper[4795]: I1205 08:40:28.614914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:40:28 crc kubenswrapper[4795]: I1205 08:40:28.629089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2deba92d-4689-450c-95e7-36cb8fc196c1-cert\") pod \"infra-operator-controller-manager-57548d458d-fb4c2\" (UID: \"2deba92d-4689-450c-95e7-36cb8fc196c1\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:40:28 crc kubenswrapper[4795]: I1205 08:40:28.833092 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bzxct" Dec 05 08:40:28 crc kubenswrapper[4795]: I1205 08:40:28.841138 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:40:29 crc kubenswrapper[4795]: I1205 08:40:29.633217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:40:29 crc kubenswrapper[4795]: I1205 08:40:29.639404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60a49846-77cd-440b-b8b2-988cd340dd18-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd45xs24\" (UID: \"60a49846-77cd-440b-b8b2-988cd340dd18\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:40:29 crc kubenswrapper[4795]: I1205 08:40:29.660240 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-n8pq2" Dec 05 08:40:29 crc kubenswrapper[4795]: I1205 08:40:29.668389 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:40:30 crc kubenswrapper[4795]: I1205 08:40:30.242493 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:30 crc kubenswrapper[4795]: I1205 08:40:30.247985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b993e6ee-cadd-4671-99dd-bb54433c0064-webhook-certs\") pod \"openstack-operator-controller-manager-8c7b64495-p2lwl\" (UID: \"b993e6ee-cadd-4671-99dd-bb54433c0064\") " pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:30 crc kubenswrapper[4795]: I1205 08:40:30.411206 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kbh7x" Dec 05 08:40:30 crc kubenswrapper[4795]: I1205 08:40:30.419079 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:30 crc kubenswrapper[4795]: E1205 08:40:30.437841 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 05 08:40:30 crc kubenswrapper[4795]: E1205 08:40:30.438386 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-58tjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-kvhsl_openstack-operators(6fb6884d-9a5b-40bd-bc15-d51a6a645645): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:30 crc kubenswrapper[4795]: I1205 08:40:30.441039 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 08:40:32 crc kubenswrapper[4795]: E1205 08:40:32.953507 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 05 08:40:32 crc kubenswrapper[4795]: E1205 08:40:32.954090 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g772v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-b9vm2_openstack-operators(61b3d615-1654-4fc5-a601-43f68103ac52): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:33 crc kubenswrapper[4795]: E1205 08:40:33.513730 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 05 08:40:33 crc kubenswrapper[4795]: E1205 08:40:33.514751 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dtnkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-tkk5l_openstack-operators(63be1623-e1cd-4904-99cb-9497a6596599): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:34 crc kubenswrapper[4795]: E1205 08:40:34.114037 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 05 08:40:34 crc kubenswrapper[4795]: E1205 08:40:34.114412 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jdtxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-t29gj_openstack-operators(e8ef9580-dae6-4db8-aa6d-5c600b8ae507): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:34 crc kubenswrapper[4795]: E1205 08:40:34.806501 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 05 08:40:34 crc kubenswrapper[4795]: E1205 08:40:34.807403 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7ppm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-fxncz_openstack-operators(df2401aa-47d5-4301-93ea-41a8c8b32cc9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:35 crc kubenswrapper[4795]: E1205 08:40:35.355844 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 05 08:40:35 crc kubenswrapper[4795]: E1205 08:40:35.356107 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s9b98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-n89xx_openstack-operators(33394c60-0058-4c0a-8582-cdd95c25bd19): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:37 crc kubenswrapper[4795]: E1205 08:40:37.214386 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 05 08:40:37 crc kubenswrapper[4795]: E1205 08:40:37.215235 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kvblx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-8dh85_openstack-operators(8db4219d-3a4b-4470-9c6d-db1b98c9b3dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:37 crc kubenswrapper[4795]: E1205 08:40:37.992417 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621" Dec 05 08:40:37 crc kubenswrapper[4795]: E1205 08:40:37.992642 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q6z2p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-pzc2j_openstack-operators(201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:38 crc kubenswrapper[4795]: E1205 08:40:38.631294 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 05 08:40:38 crc kubenswrapper[4795]: E1205 08:40:38.631546 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c7j46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-mbkw8_openstack-operators(b41e588d-948f-4709-8717-cfbe8fbba4c9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:39 crc kubenswrapper[4795]: E1205 08:40:39.697048 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 05 08:40:39 crc kubenswrapper[4795]: E1205 08:40:39.697764 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mxxzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-g6ghd_openstack-operators(d83675df-1935-473f-925e-5b40d61aadfa): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:39 crc kubenswrapper[4795]: E1205 08:40:39.699062 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd" podUID="d83675df-1935-473f-925e-5b40d61aadfa" Dec 05 08:40:40 crc kubenswrapper[4795]: E1205 08:40:40.149508 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd" podUID="d83675df-1935-473f-925e-5b40d61aadfa" Dec 05 08:40:40 crc kubenswrapper[4795]: I1205 08:40:40.384196 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24"] Dec 05 08:40:40 crc kubenswrapper[4795]: I1205 08:40:40.514576 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2"] Dec 05 08:40:40 crc kubenswrapper[4795]: I1205 08:40:40.560423 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl"] Dec 05 08:40:40 crc kubenswrapper[4795]: I1205 08:40:40.827485 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:40:40 crc kubenswrapper[4795]: I1205 08:40:40.828033 4795 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:40:40 crc kubenswrapper[4795]: W1205 08:40:40.902356 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb993e6ee_cadd_4671_99dd_bb54433c0064.slice/crio-4fe47314f8f4fdd8155b5f7daf1cc53e07030e2363a6dc7f3a07316e0f638ae1 WatchSource:0}: Error finding container 4fe47314f8f4fdd8155b5f7daf1cc53e07030e2363a6dc7f3a07316e0f638ae1: Status 404 returned error can't find the container with id 4fe47314f8f4fdd8155b5f7daf1cc53e07030e2363a6dc7f3a07316e0f638ae1 Dec 05 08:40:41 crc kubenswrapper[4795]: I1205 08:40:41.157532 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" event={"ID":"b993e6ee-cadd-4671-99dd-bb54433c0064","Type":"ContainerStarted","Data":"4fe47314f8f4fdd8155b5f7daf1cc53e07030e2363a6dc7f3a07316e0f638ae1"} Dec 05 08:40:41 crc kubenswrapper[4795]: I1205 08:40:41.160975 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4" event={"ID":"75cb47f2-6153-4f9b-9634-151793360092","Type":"ContainerStarted","Data":"9b3d31956a78994385828c6442bdc7cfefc13036cd79ed85637543c59825fd73"} Dec 05 08:40:41 crc kubenswrapper[4795]: I1205 08:40:41.166669 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" event={"ID":"60a49846-77cd-440b-b8b2-988cd340dd18","Type":"ContainerStarted","Data":"f7c143a549e0605b129b3f4db609c7dc0312a97c4981360829e5c9fb628063f1"} Dec 05 08:40:41 crc kubenswrapper[4795]: I1205 08:40:41.171631 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" event={"ID":"2deba92d-4689-450c-95e7-36cb8fc196c1","Type":"ContainerStarted","Data":"dcc14d602cf1d7d3f2ca03c4a2189173a1a49f17e39ab6c0c261622d009e8eea"} Dec 05 08:40:46 crc kubenswrapper[4795]: I1205 08:40:46.210849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" event={"ID":"f42b62b8-5856-4300-8bf6-b2299f1b5612","Type":"ContainerStarted","Data":"f24aec55181c794a57e5bb979a0f62ee718deb1f6fd9e1b3379415d18ca67249"} Dec 05 08:40:46 crc kubenswrapper[4795]: I1205 08:40:46.214560 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn" event={"ID":"a0f6ca90-a15b-4fd1-b934-c4428e4c0d90","Type":"ContainerStarted","Data":"921b6c7be9bdd1a60ebcffdebc6904642ba752894a9020f3a9b67b9139bdfbef"} Dec 05 08:40:47 crc kubenswrapper[4795]: I1205 08:40:47.230832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" event={"ID":"b993e6ee-cadd-4671-99dd-bb54433c0064","Type":"ContainerStarted","Data":"82bf9a129ac38fbd3bfad44fff6fa2ad5d2b5a6a98f868f61afd017a5b9854a8"} Dec 05 08:40:47 crc kubenswrapper[4795]: I1205 08:40:47.231319 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:40:47 crc kubenswrapper[4795]: I1205 08:40:47.273114 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" podStartSLOduration=49.27308035 podStartE2EDuration="49.27308035s" podCreationTimestamp="2025-12-05 08:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 08:40:47.267154322 +0000 UTC m=+998.839758071" watchObservedRunningTime="2025-12-05 08:40:47.27308035 +0000 UTC m=+998.845684089" Dec 05 08:40:49 crc kubenswrapper[4795]: E1205 08:40:49.978998 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 08:40:49 crc kubenswrapper[4795]: E1205 08:40:49.979578 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bwg92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
cinder-operator-controller-manager-859b6ccc6-jcbss_openstack-operators(4d920ea1-76ae-4bb3-831f-e83ac4d57fbe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:49 crc kubenswrapper[4795]: E1205 08:40:49.980926 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" podUID="4d920ea1-76ae-4bb3-831f-e83ac4d57fbe" Dec 05 08:40:50 crc kubenswrapper[4795]: E1205 08:40:50.411913 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 05 08:40:50 crc kubenswrapper[4795]: E1205 08:40:50.412176 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dcwbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-m6snn_openstack-operators(e8acf865-8373-4a37-ba22-bc276e596f2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:40:50 crc kubenswrapper[4795]: E1205 08:40:50.413499 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" podUID="e8acf865-8373-4a37-ba22-bc276e596f2d" Dec 05 08:40:51 crc kubenswrapper[4795]: E1205 08:40:51.297563 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" 
podUID="67fad932-d045-4ee7-ae85-bf528a431eb3" Dec 05 08:40:51 crc kubenswrapper[4795]: E1205 08:40:51.329063 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" podUID="201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3" Dec 05 08:40:51 crc kubenswrapper[4795]: E1205 08:40:51.439044 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" podUID="4759b941-a4a1-470a-99e1-9acc898804e9" Dec 05 08:40:51 crc kubenswrapper[4795]: E1205 08:40:51.476398 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" podUID="8db4219d-3a4b-4470-9c6d-db1b98c9b3dc" Dec 05 08:40:51 crc kubenswrapper[4795]: E1205 08:40:51.505305 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" podUID="33394c60-0058-4c0a-8582-cdd95c25bd19" Dec 05 08:40:51 crc kubenswrapper[4795]: E1205 08:40:51.592137 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" podUID="3bb0c684-dce6-453f-b3ba-184b11da37c8" Dec 05 08:40:51 crc 
kubenswrapper[4795]: E1205 08:40:51.860405 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" podUID="6fb6884d-9a5b-40bd-bc15-d51a6a645645" Dec 05 08:40:51 crc kubenswrapper[4795]: E1205 08:40:51.922009 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" podUID="e8ef9580-dae6-4db8-aa6d-5c600b8ae507" Dec 05 08:40:52 crc kubenswrapper[4795]: E1205 08:40:52.064788 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz" podUID="4d769f5c-b4d9-4049-9cf8-73d02b343b1f" Dec 05 08:40:52 crc kubenswrapper[4795]: E1205 08:40:52.221275 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" podUID="61b3d615-1654-4fc5-a601-43f68103ac52" Dec 05 08:40:52 crc kubenswrapper[4795]: E1205 08:40:52.278896 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh" podUID="6c624ff1-59b4-4d7a-af4c-0dd48235842b" Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.295368 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4" event={"ID":"75cb47f2-6153-4f9b-9634-151793360092","Type":"ContainerStarted","Data":"cec4b63819721a16522d6309197889560ba82e6e4c2faa462662b7a21eb7385c"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.295778 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4" Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.305493 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4" Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.308135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" event={"ID":"6fb6884d-9a5b-40bd-bc15-d51a6a645645","Type":"ContainerStarted","Data":"c4d584bb93a4384e49971c831db11630c4d99f98ee4e5b3c2e22ff2f308c65bc"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.317117 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" event={"ID":"67fad932-d045-4ee7-ae85-bf528a431eb3","Type":"ContainerStarted","Data":"6d69aa306f1d0fff60625ce54a2926cd09df73967e7aba69ff921f0b7f831088"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.335884 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-vbzr4" podStartSLOduration=4.121728041 podStartE2EDuration="54.335860392s" podCreationTimestamp="2025-12-05 08:39:58 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.633511126 +0000 UTC m=+952.206114865" lastFinishedPulling="2025-12-05 08:40:50.847643477 +0000 UTC m=+1002.420247216" observedRunningTime="2025-12-05 08:40:52.333225461 +0000 UTC m=+1003.905829200" watchObservedRunningTime="2025-12-05 
08:40:52.335860392 +0000 UTC m=+1003.908464131" Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.349342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" event={"ID":"4d920ea1-76ae-4bb3-831f-e83ac4d57fbe","Type":"ContainerStarted","Data":"c0e937050a871cbcf391025c3fd543e1161251a82f66cd10de343fcb2474a0e4"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.357071 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz" event={"ID":"4d769f5c-b4d9-4049-9cf8-73d02b343b1f","Type":"ContainerStarted","Data":"279e16f908d688b12560f357c0e02b682696c64cb39baa25e77d8c992eb1ba63"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.381414 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" event={"ID":"e8ef9580-dae6-4db8-aa6d-5c600b8ae507","Type":"ContainerStarted","Data":"8c0a111ed54a4eab0b16fe1fd369604fef734e16bc8d5c15fddb909a899627c8"} Dec 05 08:40:52 crc kubenswrapper[4795]: E1205 08:40:52.390346 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" podUID="df2401aa-47d5-4301-93ea-41a8c8b32cc9" Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.410959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" event={"ID":"4759b941-a4a1-470a-99e1-9acc898804e9","Type":"ContainerStarted","Data":"181442179e1156d7fc7a769ba00b55fa28db11be9ec3c30b5025e37d299415ca"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.437977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" event={"ID":"61b3d615-1654-4fc5-a601-43f68103ac52","Type":"ContainerStarted","Data":"897b0ab320c418b4c4ae0d27b2a62b73324515941f235b8e5a8226078f44bac0"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.448448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" event={"ID":"33394c60-0058-4c0a-8582-cdd95c25bd19","Type":"ContainerStarted","Data":"a05547fd6acf80ce93e0714df23b651c2f4f91c26f464ab09bb77544ed9bd0af"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.482475 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" event={"ID":"60a49846-77cd-440b-b8b2-988cd340dd18","Type":"ContainerStarted","Data":"94e30bde213b473c85729fd990e5db068dc4d9505be1935bf713b8c47e9e1cf6"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.502586 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" event={"ID":"8db4219d-3a4b-4470-9c6d-db1b98c9b3dc","Type":"ContainerStarted","Data":"8e9665489a8e487587345f90d20d77d93587921ffb6af25fe9dd8cbe9e46190c"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.549872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" event={"ID":"2deba92d-4689-450c-95e7-36cb8fc196c1","Type":"ContainerStarted","Data":"98e060342a2fd4bff2d400a5c5d350355ca395a4e74609318de2c74dfc3562fb"} Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.562076 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" event={"ID":"201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3","Type":"ContainerStarted","Data":"f7b567df4577c00512de245e8b8b072819863490d75b5a91c9bf21f41fa16f30"} Dec 05 08:40:52 
crc kubenswrapper[4795]: E1205 08:40:52.568634 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" podUID="201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3" Dec 05 08:40:52 crc kubenswrapper[4795]: I1205 08:40:52.579467 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" event={"ID":"3bb0c684-dce6-453f-b3ba-184b11da37c8","Type":"ContainerStarted","Data":"6e711a06d5a0aeef1ff5a836f71eb1993f1ffa5524f2385345163d6f7ef9e9cc"} Dec 05 08:40:52 crc kubenswrapper[4795]: E1205 08:40:52.653161 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" podUID="b41e588d-948f-4709-8717-cfbe8fbba4c9" Dec 05 08:40:52 crc kubenswrapper[4795]: E1205 08:40:52.787813 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" podUID="63be1623-e1cd-4904-99cb-9497a6596599" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.673804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" event={"ID":"f42b62b8-5856-4300-8bf6-b2299f1b5612","Type":"ContainerStarted","Data":"46d30864423ac65c2117da043ec0cb48fa45c52a000ed0a466fef8d9d90f70a4"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.674521 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.685765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.686025 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" event={"ID":"63be1623-e1cd-4904-99cb-9497a6596599","Type":"ContainerStarted","Data":"35b480dcddc2995ce757a9b89bf792680169b607f8b6edbeefaef8af4095cdb5"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.706953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" event={"ID":"b41e588d-948f-4709-8717-cfbe8fbba4c9","Type":"ContainerStarted","Data":"a21e638df51e2c4068e0f750cdb9ce1a74044dda8a11fc4f2e850d463b38c620"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.725132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" event={"ID":"60a49846-77cd-440b-b8b2-988cd340dd18","Type":"ContainerStarted","Data":"5c8c268f3db7b6e1dc959500aef5cccf3ea65fedde9bd65f14e9343423197042"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.726026 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.750887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" event={"ID":"33394c60-0058-4c0a-8582-cdd95c25bd19","Type":"ContainerStarted","Data":"20cc3ff5976459a3f70052d66fee3de8425b8f91b47f7a2032ecb0d05818f6e6"} Dec 05 08:40:53 
crc kubenswrapper[4795]: I1205 08:40:53.752033 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.775080 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-fgtwh" podStartSLOduration=4.831871133 podStartE2EDuration="57.775048705s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:39:58.022113913 +0000 UTC m=+949.594717652" lastFinishedPulling="2025-12-05 08:40:50.965291485 +0000 UTC m=+1002.537895224" observedRunningTime="2025-12-05 08:40:53.751558216 +0000 UTC m=+1005.324161965" watchObservedRunningTime="2025-12-05 08:40:53.775048705 +0000 UTC m=+1005.347652444" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.780566 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" event={"ID":"2deba92d-4689-450c-95e7-36cb8fc196c1","Type":"ContainerStarted","Data":"671236a044d950d12bc63f224991226d225a7c27988d4081cd74981e2e129222"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.781485 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.795491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" event={"ID":"df2401aa-47d5-4301-93ea-41a8c8b32cc9","Type":"ContainerStarted","Data":"c41ca6002df9013b9098dd8198ee842c01303e60b9ebc51dc19041d38681c9c6"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.812685 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn" 
event={"ID":"a0f6ca90-a15b-4fd1-b934-c4428e4c0d90","Type":"ContainerStarted","Data":"1d80f0d71d9590f90e184d40bd20055dfde1bacdb3503d1ad9a2bbe68b4a2455"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.814036 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.828450 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.852939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" event={"ID":"e8acf865-8373-4a37-ba22-bc276e596f2d","Type":"ContainerStarted","Data":"34e4143de19e0f97b4b2c7774ae2419765dfb8d2f8f92c3b3f1bb905ec7b268f"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.853838 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.857137 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" podStartSLOduration=46.91775663 podStartE2EDuration="56.857114931s" podCreationTimestamp="2025-12-05 08:39:57 +0000 UTC" firstStartedPulling="2025-12-05 08:40:40.756030541 +0000 UTC m=+992.328634280" lastFinishedPulling="2025-12-05 08:40:50.695388842 +0000 UTC m=+1002.267992581" observedRunningTime="2025-12-05 08:40:53.854547312 +0000 UTC m=+1005.427151061" watchObservedRunningTime="2025-12-05 08:40:53.857114931 +0000 UTC m=+1005.429718670" Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.893787 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh" event={"ID":"6c624ff1-59b4-4d7a-af4c-0dd48235842b","Type":"ContainerStarted","Data":"78f22c6c5a7903cf0c45b35de48ded8189ee9a1e3913b67a84c05ac09a6dcefc"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.896828 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" event={"ID":"4d920ea1-76ae-4bb3-831f-e83ac4d57fbe","Type":"ContainerStarted","Data":"afe1304be78aa0bd0e1e16847a7f27011a80275acd77632c7754322f3cb0e7ca"} Dec 05 08:40:53 crc kubenswrapper[4795]: I1205 08:40:53.896864 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.002120 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" podStartSLOduration=4.24486253 podStartE2EDuration="57.002086541s" podCreationTimestamp="2025-12-05 08:39:57 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.232493548 +0000 UTC m=+951.805097287" lastFinishedPulling="2025-12-05 08:40:52.989717559 +0000 UTC m=+1004.562321298" observedRunningTime="2025-12-05 08:40:53.998797693 +0000 UTC m=+1005.571401432" watchObservedRunningTime="2025-12-05 08:40:54.002086541 +0000 UTC m=+1005.574690280" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.043919 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" podStartSLOduration=5.744269397 podStartE2EDuration="58.04389752s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:39:58.66409829 +0000 UTC m=+950.236702029" lastFinishedPulling="2025-12-05 08:40:50.963726413 +0000 UTC m=+1002.536330152" observedRunningTime="2025-12-05 08:40:54.040245632 +0000 UTC 
m=+1005.612849371" watchObservedRunningTime="2025-12-05 08:40:54.04389752 +0000 UTC m=+1005.616501259" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.085535 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-mjmbn" podStartSLOduration=6.43215472 podStartE2EDuration="57.085496603s" podCreationTimestamp="2025-12-05 08:39:57 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.463680052 +0000 UTC m=+952.036283791" lastFinishedPulling="2025-12-05 08:40:51.117021935 +0000 UTC m=+1002.689625674" observedRunningTime="2025-12-05 08:40:54.07491092 +0000 UTC m=+1005.647514689" watchObservedRunningTime="2025-12-05 08:40:54.085496603 +0000 UTC m=+1005.658100342" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.241392 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" podStartSLOduration=6.757620636 podStartE2EDuration="58.241368554s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.392857554 +0000 UTC m=+951.965461293" lastFinishedPulling="2025-12-05 08:40:51.876605472 +0000 UTC m=+1003.449209211" observedRunningTime="2025-12-05 08:40:54.223076705 +0000 UTC m=+1005.795680444" watchObservedRunningTime="2025-12-05 08:40:54.241368554 +0000 UTC m=+1005.813972293" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.782695 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" podStartSLOduration=48.588126729 podStartE2EDuration="58.782674529s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:40:40.734824804 +0000 UTC m=+992.307428543" lastFinishedPulling="2025-12-05 08:40:50.929372604 +0000 UTC m=+1002.501976343" observedRunningTime="2025-12-05 08:40:54.306168498 +0000 UTC m=+1005.878772237" 
watchObservedRunningTime="2025-12-05 08:40:54.782674529 +0000 UTC m=+1006.355278268" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.918443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" event={"ID":"e8acf865-8373-4a37-ba22-bc276e596f2d","Type":"ContainerStarted","Data":"9c0c8e9d37166af68fccf355406b0107e7c51d29c9a96d8ca62e1c209daa082e"} Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.949749 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.956339 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.964711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" event={"ID":"6fb6884d-9a5b-40bd-bc15-d51a6a645645","Type":"ContainerStarted","Data":"e7e6635af71ad83c4aa127ef8ca9561627e2251283e02a226a84ea4a6d9df058"} Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.965285 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.967712 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh" event={"ID":"6c624ff1-59b4-4d7a-af4c-0dd48235842b","Type":"ContainerStarted","Data":"69c226481a0cbf9769ae3890a0a2689c63489a907f464ba62949c1c84c68ebcc"} Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.968123 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh" Dec 05 08:40:54 crc 
kubenswrapper[4795]: I1205 08:40:54.977254 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.988664 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" podStartSLOduration=5.43551859 podStartE2EDuration="58.988644641s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:39:59.859656779 +0000 UTC m=+951.432260508" lastFinishedPulling="2025-12-05 08:40:53.41278282 +0000 UTC m=+1004.985386559" observedRunningTime="2025-12-05 08:40:54.985130678 +0000 UTC m=+1006.557734417" watchObservedRunningTime="2025-12-05 08:40:54.988644641 +0000 UTC m=+1006.561248380" Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.994420 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" event={"ID":"e8ef9580-dae6-4db8-aa6d-5c600b8ae507","Type":"ContainerStarted","Data":"82882062a2193d041f8c841d193409ea0b4ae47fc51fb874f329e42676540d5e"} Dec 05 08:40:54 crc kubenswrapper[4795]: I1205 08:40:54.995292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" Dec 05 08:40:55 crc kubenswrapper[4795]: I1205 08:40:55.070597 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh" podStartSLOduration=5.733963754 podStartE2EDuration="58.070572944s" podCreationTimestamp="2025-12-05 08:39:57 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.652309328 +0000 UTC m=+952.224913067" lastFinishedPulling="2025-12-05 08:40:52.988918518 +0000 UTC m=+1004.561522257" observedRunningTime="2025-12-05 08:40:55.06591605 +0000 UTC m=+1006.638519789" 
watchObservedRunningTime="2025-12-05 08:40:55.070572944 +0000 UTC m=+1006.643176683" Dec 05 08:40:55 crc kubenswrapper[4795]: I1205 08:40:55.152344 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz" podStartSLOduration=4.495186931 podStartE2EDuration="57.152325391s" podCreationTimestamp="2025-12-05 08:39:58 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.9921715 +0000 UTC m=+952.564775239" lastFinishedPulling="2025-12-05 08:40:53.64930995 +0000 UTC m=+1005.221913699" observedRunningTime="2025-12-05 08:40:55.149223918 +0000 UTC m=+1006.721827657" watchObservedRunningTime="2025-12-05 08:40:55.152325391 +0000 UTC m=+1006.724929130" Dec 05 08:40:55 crc kubenswrapper[4795]: I1205 08:40:55.152556 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" podStartSLOduration=5.036841527 podStartE2EDuration="58.152549657s" podCreationTimestamp="2025-12-05 08:39:57 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.374191915 +0000 UTC m=+951.946795654" lastFinishedPulling="2025-12-05 08:40:53.489900045 +0000 UTC m=+1005.062503784" observedRunningTime="2025-12-05 08:40:55.113183684 +0000 UTC m=+1006.685787423" watchObservedRunningTime="2025-12-05 08:40:55.152549657 +0000 UTC m=+1006.725153396" Dec 05 08:40:55 crc kubenswrapper[4795]: I1205 08:40:55.172075 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" podStartSLOduration=4.667062764 podStartE2EDuration="59.171899215s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:39:59.146199535 +0000 UTC m=+950.718803274" lastFinishedPulling="2025-12-05 08:40:53.651035986 +0000 UTC m=+1005.223639725" observedRunningTime="2025-12-05 08:40:55.168145125 +0000 UTC m=+1006.740748864" 
watchObservedRunningTime="2025-12-05 08:40:55.171899215 +0000 UTC m=+1006.744502954" Dec 05 08:40:55 crc kubenswrapper[4795]: I1205 08:40:55.220034 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" podStartSLOduration=6.6029609879999995 podStartE2EDuration="59.220014513s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.373504327 +0000 UTC m=+951.946108066" lastFinishedPulling="2025-12-05 08:40:52.990557852 +0000 UTC m=+1004.563161591" observedRunningTime="2025-12-05 08:40:55.214959278 +0000 UTC m=+1006.787563017" watchObservedRunningTime="2025-12-05 08:40:55.220014513 +0000 UTC m=+1006.792618252" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.035848 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" event={"ID":"8db4219d-3a4b-4470-9c6d-db1b98c9b3dc","Type":"ContainerStarted","Data":"e43cf7b522b54b21c9aae3f3b5ebc0e2ddfcbfea4c2928fdc9967ea5dd19746c"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.038791 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.043237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" event={"ID":"61b3d615-1654-4fc5-a601-43f68103ac52","Type":"ContainerStarted","Data":"ed16b95ab59bf6e264b04d30bcf82cfc24734cd845eb2267b24bca3161f1533d"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.044698 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.063895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" event={"ID":"67fad932-d045-4ee7-ae85-bf528a431eb3","Type":"ContainerStarted","Data":"fde939f9764c595e41b27e308717e5eaa474c14ff467737ed3cbd896e50cdea8"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.065053 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.076277 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" event={"ID":"4759b941-a4a1-470a-99e1-9acc898804e9","Type":"ContainerStarted","Data":"f32866051fd8e1878452f6964727e50bd987f8aa6ca2caedf23ee1f4f6385c23"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.087023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz" event={"ID":"4d769f5c-b4d9-4049-9cf8-73d02b343b1f","Type":"ContainerStarted","Data":"6b725a41e0c7e62af2fc51ce9a3fc0d731d1baf620be1f3a4a7323339bf4c45b"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.103692 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" event={"ID":"63be1623-e1cd-4904-99cb-9497a6596599","Type":"ContainerStarted","Data":"707d3d6d77bb7dd086db188de59fcc05e7300f96dc83eac47c57b5f4d7c1edf2"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.104517 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.120992 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" 
event={"ID":"3bb0c684-dce6-453f-b3ba-184b11da37c8","Type":"ContainerStarted","Data":"32a82a87d5cd965b5ff1bef74e1a27937edf62c8bcfbb4e165aee27c814d0ced"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.128589 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" event={"ID":"b41e588d-948f-4709-8717-cfbe8fbba4c9","Type":"ContainerStarted","Data":"ac7be48fafcf285d96dcca532500c7d92ec0c64c254e11164d7131ffad9af2cb"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.129621 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.132980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd" event={"ID":"d83675df-1935-473f-925e-5b40d61aadfa","Type":"ContainerStarted","Data":"42450959c001e63ed0c04f7e29cbfbc9e7de5f3816a717dce6babb9cf1b4c55b"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.142938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" event={"ID":"df2401aa-47d5-4301-93ea-41a8c8b32cc9","Type":"ContainerStarted","Data":"2a3b813539594650871d95491b28ee3775c20c9f32f53fc7b214deb20ec792ec"} Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.142976 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.145550 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" podStartSLOduration=7.200817657 podStartE2EDuration="1m0.14553374s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 
08:40:00.545590192 +0000 UTC m=+952.118193931" lastFinishedPulling="2025-12-05 08:40:53.490306275 +0000 UTC m=+1005.062910014" observedRunningTime="2025-12-05 08:40:56.07603809 +0000 UTC m=+1007.648641829" watchObservedRunningTime="2025-12-05 08:40:56.14553374 +0000 UTC m=+1007.718137469" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.145951 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" podStartSLOduration=6.421106246 podStartE2EDuration="1m0.145945361s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:39:59.90782899 +0000 UTC m=+951.480432719" lastFinishedPulling="2025-12-05 08:40:53.632668095 +0000 UTC m=+1005.205271834" observedRunningTime="2025-12-05 08:40:56.141033709 +0000 UTC m=+1007.713637448" watchObservedRunningTime="2025-12-05 08:40:56.145945361 +0000 UTC m=+1007.718549100" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.156911 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-fb4c2" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.191379 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" podStartSLOduration=6.056221353 podStartE2EDuration="1m0.191355306s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:39:58.854683369 +0000 UTC m=+950.427287108" lastFinishedPulling="2025-12-05 08:40:52.989817322 +0000 UTC m=+1004.562421061" observedRunningTime="2025-12-05 08:40:56.185687844 +0000 UTC m=+1007.758291573" watchObservedRunningTime="2025-12-05 08:40:56.191355306 +0000 UTC m=+1007.763959035" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.239475 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" podStartSLOduration=5.624223289 podStartE2EDuration="1m0.239451593s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.616688908 +0000 UTC m=+952.189292647" lastFinishedPulling="2025-12-05 08:40:55.231917212 +0000 UTC m=+1006.804520951" observedRunningTime="2025-12-05 08:40:56.232120997 +0000 UTC m=+1007.804724736" watchObservedRunningTime="2025-12-05 08:40:56.239451593 +0000 UTC m=+1007.812055332" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.358847 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g6ghd" podStartSLOduration=3.821983162 podStartE2EDuration="58.358826578s" podCreationTimestamp="2025-12-05 08:39:58 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.704998361 +0000 UTC m=+952.277602100" lastFinishedPulling="2025-12-05 08:40:55.241841787 +0000 UTC m=+1006.814445516" observedRunningTime="2025-12-05 08:40:56.281371555 +0000 UTC m=+1007.853975294" watchObservedRunningTime="2025-12-05 08:40:56.358826578 +0000 UTC m=+1007.931430317" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.360566 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" podStartSLOduration=5.925283898 podStartE2EDuration="1m0.360559384s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.536755921 +0000 UTC m=+952.109359660" lastFinishedPulling="2025-12-05 08:40:54.972031407 +0000 UTC m=+1006.544635146" observedRunningTime="2025-12-05 08:40:56.330739746 +0000 UTC m=+1007.903343485" watchObservedRunningTime="2025-12-05 08:40:56.360559384 +0000 UTC m=+1007.933163123" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.389298 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" podStartSLOduration=5.564254257 podStartE2EDuration="1m0.389273593s" podCreationTimestamp="2025-12-05 08:39:56 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.274518843 +0000 UTC m=+951.847122582" lastFinishedPulling="2025-12-05 08:40:55.099538179 +0000 UTC m=+1006.672141918" observedRunningTime="2025-12-05 08:40:56.387193197 +0000 UTC m=+1007.959796936" watchObservedRunningTime="2025-12-05 08:40:56.389273593 +0000 UTC m=+1007.961877332" Dec 05 08:40:56 crc kubenswrapper[4795]: I1205 08:40:56.640591 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" Dec 05 08:40:57 crc kubenswrapper[4795]: I1205 08:40:57.552659 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" Dec 05 08:40:59 crc kubenswrapper[4795]: I1205 08:40:59.674451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd45xs24" Dec 05 08:41:00 crc kubenswrapper[4795]: I1205 08:41:00.425632 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-8c7b64495-p2lwl" Dec 05 08:41:06 crc kubenswrapper[4795]: I1205 08:41:06.661537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-s6td2" Dec 05 08:41:06 crc kubenswrapper[4795]: I1205 08:41:06.782156 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-kvhsl" Dec 05 08:41:06 crc kubenswrapper[4795]: I1205 08:41:06.984338 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-j6sk2" Dec 05 08:41:07 crc kubenswrapper[4795]: I1205 08:41:07.010770 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-t29gj" Dec 05 08:41:07 crc kubenswrapper[4795]: I1205 08:41:07.060231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-b9vm2" Dec 05 08:41:07 crc kubenswrapper[4795]: I1205 08:41:07.180728 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-8dh85" Dec 05 08:41:07 crc kubenswrapper[4795]: I1205 08:41:07.275539 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" Dec 05 08:41:07 crc kubenswrapper[4795]: I1205 08:41:07.550877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-n89xx" Dec 05 08:41:07 crc kubenswrapper[4795]: I1205 08:41:07.633598 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" Dec 05 08:41:07 crc kubenswrapper[4795]: I1205 08:41:07.712503 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-fxncz" Dec 05 08:41:08 crc kubenswrapper[4795]: I1205 08:41:08.118798 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-mjbz5" Dec 05 08:41:08 crc kubenswrapper[4795]: I1205 08:41:08.717594 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-stfxh" Dec 05 08:41:08 crc kubenswrapper[4795]: I1205 08:41:08.768325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-xbwwz" Dec 05 08:41:09 crc kubenswrapper[4795]: I1205 08:41:09.277917 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" event={"ID":"201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3","Type":"ContainerStarted","Data":"40182a7212ec553aca7cf4b64eddbbc3ec6e063ea6a8b21cf135a238f5f4b395"} Dec 05 08:41:09 crc kubenswrapper[4795]: I1205 08:41:09.278164 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" Dec 05 08:41:09 crc kubenswrapper[4795]: I1205 08:41:09.316874 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" podStartSLOduration=4.113171988 podStartE2EDuration="1m11.31684985s" podCreationTimestamp="2025-12-05 08:39:58 +0000 UTC" firstStartedPulling="2025-12-05 08:40:00.992587991 +0000 UTC m=+952.565191730" lastFinishedPulling="2025-12-05 08:41:08.196265853 +0000 UTC m=+1019.768869592" observedRunningTime="2025-12-05 08:41:09.297266056 +0000 UTC m=+1020.869869795" watchObservedRunningTime="2025-12-05 08:41:09.31684985 +0000 UTC m=+1020.889453589" Dec 05 08:41:10 crc kubenswrapper[4795]: I1205 08:41:10.827947 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:41:10 crc kubenswrapper[4795]: I1205 08:41:10.828460 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:41:18 crc kubenswrapper[4795]: I1205 08:41:18.566367 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-pzc2j" Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.897437 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j52js"] Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.901428 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.912470 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.912696 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-n4btk" Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.912727 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.918352 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.920162 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j52js"] Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.946533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85663bec-4131-4032-ae97-3ad346ea96ec-config\") pod \"dnsmasq-dns-675f4bcbfc-j52js\" (UID: 
\"85663bec-4131-4032-ae97-3ad346ea96ec\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.946635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk52n\" (UniqueName: \"kubernetes.io/projected/85663bec-4131-4032-ae97-3ad346ea96ec-kube-api-access-jk52n\") pod \"dnsmasq-dns-675f4bcbfc-j52js\" (UID: \"85663bec-4131-4032-ae97-3ad346ea96ec\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.996469 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-hzq8h"] Dec 05 08:41:35 crc kubenswrapper[4795]: I1205 08:41:35.998001 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.006442 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.039307 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-hzq8h"] Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.048521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.048630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-config\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: 
I1205 08:41:36.048695 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85663bec-4131-4032-ae97-3ad346ea96ec-config\") pod \"dnsmasq-dns-675f4bcbfc-j52js\" (UID: \"85663bec-4131-4032-ae97-3ad346ea96ec\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.048751 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk52n\" (UniqueName: \"kubernetes.io/projected/85663bec-4131-4032-ae97-3ad346ea96ec-kube-api-access-jk52n\") pod \"dnsmasq-dns-675f4bcbfc-j52js\" (UID: \"85663bec-4131-4032-ae97-3ad346ea96ec\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.048821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m58qh\" (UniqueName: \"kubernetes.io/projected/58475528-6db5-4fb0-bcee-67d276358d8b-kube-api-access-m58qh\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.050886 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85663bec-4131-4032-ae97-3ad346ea96ec-config\") pod \"dnsmasq-dns-675f4bcbfc-j52js\" (UID: \"85663bec-4131-4032-ae97-3ad346ea96ec\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.122741 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk52n\" (UniqueName: \"kubernetes.io/projected/85663bec-4131-4032-ae97-3ad346ea96ec-kube-api-access-jk52n\") pod \"dnsmasq-dns-675f4bcbfc-j52js\" (UID: \"85663bec-4131-4032-ae97-3ad346ea96ec\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.150695 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m58qh\" (UniqueName: \"kubernetes.io/projected/58475528-6db5-4fb0-bcee-67d276358d8b-kube-api-access-m58qh\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.150786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.150847 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-config\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.151981 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-config\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.152037 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.168493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m58qh\" (UniqueName: 
\"kubernetes.io/projected/58475528-6db5-4fb0-bcee-67d276358d8b-kube-api-access-m58qh\") pod \"dnsmasq-dns-78dd6ddcc-hzq8h\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.226741 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.314150 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.764818 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-hzq8h"] Dec 05 08:41:36 crc kubenswrapper[4795]: I1205 08:41:36.868481 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j52js"] Dec 05 08:41:37 crc kubenswrapper[4795]: I1205 08:41:37.517413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" event={"ID":"85663bec-4131-4032-ae97-3ad346ea96ec","Type":"ContainerStarted","Data":"cee92f6a029c2150365b8ab5f2c46a599873c428aa55f5cbb6f6f0c290d064ec"} Dec 05 08:41:37 crc kubenswrapper[4795]: I1205 08:41:37.520034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" event={"ID":"58475528-6db5-4fb0-bcee-67d276358d8b","Type":"ContainerStarted","Data":"4c970e068f63cf71e21d7b07d06e31eaba88276b45b6494cb39dbc418a4b270c"} Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.167762 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j52js"] Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.221631 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9gb88"] Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.223423 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.248808 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9gb88"] Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.409896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.410149 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjzl\" (UniqueName: \"kubernetes.io/projected/b716ddfa-bbff-444b-bed7-275b451068bf-kube-api-access-mxjzl\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.421370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-config\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.522948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.523025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjzl\" (UniqueName: 
\"kubernetes.io/projected/b716ddfa-bbff-444b-bed7-275b451068bf-kube-api-access-mxjzl\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.523071 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-config\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.524070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-config\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.525773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.559207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjzl\" (UniqueName: \"kubernetes.io/projected/b716ddfa-bbff-444b-bed7-275b451068bf-kube-api-access-mxjzl\") pod \"dnsmasq-dns-666b6646f7-9gb88\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.682047 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-hzq8h"] Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.722811 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-fvs5f"] Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.726905 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.792455 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fvs5f"] Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.833782 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.833881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-config\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.833935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vtb5\" (UniqueName: \"kubernetes.io/projected/5ee704d2-2665-4802-874e-3a0e7573e39d-kube-api-access-2vtb5\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.857346 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.935624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.935703 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-config\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.935750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vtb5\" (UniqueName: \"kubernetes.io/projected/5ee704d2-2665-4802-874e-3a0e7573e39d-kube-api-access-2vtb5\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.938298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-config\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:39 crc kubenswrapper[4795]: I1205 08:41:39.938912 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 
08:41:40.013059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vtb5\" (UniqueName: \"kubernetes.io/projected/5ee704d2-2665-4802-874e-3a0e7573e39d-kube-api-access-2vtb5\") pod \"dnsmasq-dns-57d769cc4f-fvs5f\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.080463 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.412583 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.414077 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.418957 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.419942 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-q5tjc" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.420113 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.420236 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.423870 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.424042 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.424234 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-plugins-conf" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.462103 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472265 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472333 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/956aa512-9ab5-4c74-863b-3ed2a14535d9-pod-info\") pod \"rabbitmq-server-0\" 
(UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472391 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/956aa512-9ab5-4c74-863b-3ed2a14535d9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-config-data\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf66k\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-kube-api-access-qf66k\") pod \"rabbitmq-server-0\" (UID: 
\"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.472519 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.583711 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.583783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/956aa512-9ab5-4c74-863b-3ed2a14535d9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.583800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/956aa512-9ab5-4c74-863b-3ed2a14535d9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.583823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-config-data\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.583841 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.583868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.583887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf66k\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-kube-api-access-qf66k\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.583997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.584023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.584042 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.584065 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.585305 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.585526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-config-data\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.585719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.585887 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") device mount path \"/mnt/openstack/pv06\"" 
pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.597185 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.647829 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.648847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.649211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.670495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/956aa512-9ab5-4c74-863b-3ed2a14535d9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.671014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/956aa512-9ab5-4c74-863b-3ed2a14535d9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.689099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf66k\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-kube-api-access-qf66k\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.715797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.771972 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.830091 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.830178 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.830240 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.833202 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2f6080d55cccfdc27d13b3507aa7946f9ae66b27d3649b388782040496135b5"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.835957 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9gb88"] Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.836030 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://c2f6080d55cccfdc27d13b3507aa7946f9ae66b27d3649b388782040496135b5" 
gracePeriod=600 Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.963260 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.965569 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.977111 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xtb2s" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.977433 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.977548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.977674 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.977865 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.977980 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 08:41:40 crc kubenswrapper[4795]: I1205 08:41:40.978093 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.001895 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095649 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095705 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095736 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095805 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095854 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfsr4\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-kube-api-access-nfsr4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095877 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.095920 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.121947 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fvs5f"] Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197141 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197543 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfsr4\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-kube-api-access-nfsr4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197650 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197849 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.197872 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.199004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.199268 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.199696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.211235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.215251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.216761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.225091 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.226496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfsr4\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-kube-api-access-nfsr4\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.227322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 
08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.227929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.228098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.270100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.339646 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.695688 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:41:41 crc kubenswrapper[4795]: W1205 08:41:41.734144 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod956aa512_9ab5_4c74_863b_3ed2a14535d9.slice/crio-8bc2af43fb858b496565f4d8af07cf291445ddf66d4ddf4d09526c7d558b60c7 WatchSource:0}: Error finding container 8bc2af43fb858b496565f4d8af07cf291445ddf66d4ddf4d09526c7d558b60c7: Status 404 returned error can't find the container with id 8bc2af43fb858b496565f4d8af07cf291445ddf66d4ddf4d09526c7d558b60c7 Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.739535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" event={"ID":"b716ddfa-bbff-444b-bed7-275b451068bf","Type":"ContainerStarted","Data":"707387eacf588c96ef19127d41ec8c752928761605241c7c68dfc2e3cdd959ef"} Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.800939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" event={"ID":"5ee704d2-2665-4802-874e-3a0e7573e39d","Type":"ContainerStarted","Data":"1e696cf7c6035448233c8839d8b31fac42764067cefff289cd5b1aab0c1ab5b4"} Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.822066 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:41:41 crc kubenswrapper[4795]: W1205 08:41:41.832622 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec8515f5_24b3_4930_9df2_90c25e2f8e6e.slice/crio-f42edc77403d08cb5c45460eb21d61a862047365e32e409475770ff85828eb70 WatchSource:0}: Error finding container f42edc77403d08cb5c45460eb21d61a862047365e32e409475770ff85828eb70: Status 404 returned error can't find 
the container with id f42edc77403d08cb5c45460eb21d61a862047365e32e409475770ff85828eb70 Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.834071 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="c2f6080d55cccfdc27d13b3507aa7946f9ae66b27d3649b388782040496135b5" exitCode=0 Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.834121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"c2f6080d55cccfdc27d13b3507aa7946f9ae66b27d3649b388782040496135b5"} Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.834156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"c93ddbd048ff8d41779ab69c4d06b72c0bf8343289b56925c9b595ac0b0536d9"} Dec 05 08:41:41 crc kubenswrapper[4795]: I1205 08:41:41.834178 4795 scope.go:117] "RemoveContainer" containerID="5cfc8d950f452da6a1e1434084528e4b3072305c3ebf7fe0ef0d6483a3606312" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.102200 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.105141 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.117419 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-2xfls" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.118740 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.131413 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.141026 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.141590 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.172772 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.243853 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbxbt\" (UniqueName: \"kubernetes.io/projected/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-kube-api-access-xbxbt\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.243918 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.243949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-kolla-config\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.244006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.244038 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.244062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.244090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-config-data-default\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.244108 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.345366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.345427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.345461 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.345487 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-config-data-default\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.345512 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.345542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbxbt\" (UniqueName: \"kubernetes.io/projected/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-kube-api-access-xbxbt\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.345571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.345598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-kolla-config\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.347021 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-kolla-config\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.348417 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.348675 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-config-data-default\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.348732 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.349239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.384672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.392132 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.392687 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbxbt\" (UniqueName: 
\"kubernetes.io/projected/e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6-kube-api-access-xbxbt\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.451242 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6\") " pod="openstack/openstack-galera-0" Dec 05 08:41:42 crc kubenswrapper[4795]: I1205 08:41:42.761447 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.092591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec8515f5-24b3-4930-9df2-90c25e2f8e6e","Type":"ContainerStarted","Data":"f42edc77403d08cb5c45460eb21d61a862047365e32e409475770ff85828eb70"} Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.109237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"956aa512-9ab5-4c74-863b-3ed2a14535d9","Type":"ContainerStarted","Data":"8bc2af43fb858b496565f4d8af07cf291445ddf66d4ddf4d09526c7d558b60c7"} Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.497016 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.509207 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.530812 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.531156 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.531234 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.531438 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.531529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9nr94" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.577513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.577575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09a55d95-050f-4262-9bb4-7dc81ae6ea34-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.577604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.577689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.577728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.577768 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a55d95-050f-4262-9bb4-7dc81ae6ea34-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.577792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b24k\" (UniqueName: \"kubernetes.io/projected/09a55d95-050f-4262-9bb4-7dc81ae6ea34-kube-api-access-8b24k\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.578211 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/09a55d95-050f-4262-9bb4-7dc81ae6ea34-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.683180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a55d95-050f-4262-9bb4-7dc81ae6ea34-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.683265 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.683289 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09a55d95-050f-4262-9bb4-7dc81ae6ea34-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.683307 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.683347 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.683375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.683407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a55d95-050f-4262-9bb4-7dc81ae6ea34-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.683432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b24k\" (UniqueName: \"kubernetes.io/projected/09a55d95-050f-4262-9bb4-7dc81ae6ea34-kube-api-access-8b24k\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.684900 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.687298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.693020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.695773 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/09a55d95-050f-4262-9bb4-7dc81ae6ea34-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.709884 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/09a55d95-050f-4262-9bb4-7dc81ae6ea34-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.729823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/09a55d95-050f-4262-9bb4-7dc81ae6ea34-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.731892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a55d95-050f-4262-9bb4-7dc81ae6ea34-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " 
pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.768482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b24k\" (UniqueName: \"kubernetes.io/projected/09a55d95-050f-4262-9bb4-7dc81ae6ea34-kube-api-access-8b24k\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.826876 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"09a55d95-050f-4262-9bb4-7dc81ae6ea34\") " pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:43 crc kubenswrapper[4795]: I1205 08:41:43.902273 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.274072 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.275705 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.295165 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vj4lb" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.295451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.296105 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.325734 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.347434 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.406673 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468214b1-1b8a-4714-a2b5-9913dead10a6-config-data\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.406788 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/468214b1-1b8a-4714-a2b5-9913dead10a6-kolla-config\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.406812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468214b1-1b8a-4714-a2b5-9913dead10a6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 
crc kubenswrapper[4795]: I1205 08:41:44.406855 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/468214b1-1b8a-4714-a2b5-9913dead10a6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.406917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kdms\" (UniqueName: \"kubernetes.io/projected/468214b1-1b8a-4714-a2b5-9913dead10a6-kube-api-access-8kdms\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: W1205 08:41:44.459949 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8c2d84c_15c8_48c5_a0d2_ed17cb2c09a6.slice/crio-a53139b27debae150765f5e823da3a4eef5437e0f1b979020fa8d1082d21f390 WatchSource:0}: Error finding container a53139b27debae150765f5e823da3a4eef5437e0f1b979020fa8d1082d21f390: Status 404 returned error can't find the container with id a53139b27debae150765f5e823da3a4eef5437e0f1b979020fa8d1082d21f390 Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.513125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/468214b1-1b8a-4714-a2b5-9913dead10a6-kolla-config\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.513170 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468214b1-1b8a-4714-a2b5-9913dead10a6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " 
pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.513215 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/468214b1-1b8a-4714-a2b5-9913dead10a6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.513259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kdms\" (UniqueName: \"kubernetes.io/projected/468214b1-1b8a-4714-a2b5-9913dead10a6-kube-api-access-8kdms\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.513304 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468214b1-1b8a-4714-a2b5-9913dead10a6-config-data\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.515494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/468214b1-1b8a-4714-a2b5-9913dead10a6-config-data\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.516334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/468214b1-1b8a-4714-a2b5-9913dead10a6-kolla-config\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.529766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/468214b1-1b8a-4714-a2b5-9913dead10a6-memcached-tls-certs\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.538906 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468214b1-1b8a-4714-a2b5-9913dead10a6-combined-ca-bundle\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.545403 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kdms\" (UniqueName: \"kubernetes.io/projected/468214b1-1b8a-4714-a2b5-9913dead10a6-kube-api-access-8kdms\") pod \"memcached-0\" (UID: \"468214b1-1b8a-4714-a2b5-9913dead10a6\") " pod="openstack/memcached-0" Dec 05 08:41:44 crc kubenswrapper[4795]: I1205 08:41:44.646743 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 05 08:41:45 crc kubenswrapper[4795]: I1205 08:41:45.215305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 05 08:41:45 crc kubenswrapper[4795]: I1205 08:41:45.229059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6","Type":"ContainerStarted","Data":"a53139b27debae150765f5e823da3a4eef5437e0f1b979020fa8d1082d21f390"} Dec 05 08:41:45 crc kubenswrapper[4795]: I1205 08:41:45.316046 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 05 08:41:46 crc kubenswrapper[4795]: I1205 08:41:46.222176 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:41:46 crc kubenswrapper[4795]: I1205 08:41:46.223593 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 08:41:46 crc kubenswrapper[4795]: W1205 08:41:46.238048 4795 reflector.go:561] object-"openstack"/"telemetry-ceilometer-dockercfg-tbrv2": failed to list *v1.Secret: secrets "telemetry-ceilometer-dockercfg-tbrv2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 05 08:41:46 crc kubenswrapper[4795]: E1205 08:41:46.238601 4795 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"telemetry-ceilometer-dockercfg-tbrv2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"telemetry-ceilometer-dockercfg-tbrv2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 05 08:41:46 crc kubenswrapper[4795]: I1205 08:41:46.276579 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:41:46 crc kubenswrapper[4795]: I1205 08:41:46.302779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txz62\" (UniqueName: \"kubernetes.io/projected/817d20b1-4cfa-4cae-98ae-cf2e4f379726-kube-api-access-txz62\") pod \"kube-state-metrics-0\" (UID: \"817d20b1-4cfa-4cae-98ae-cf2e4f379726\") " pod="openstack/kube-state-metrics-0" Dec 05 08:41:46 crc kubenswrapper[4795]: I1205 08:41:46.405172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txz62\" (UniqueName: \"kubernetes.io/projected/817d20b1-4cfa-4cae-98ae-cf2e4f379726-kube-api-access-txz62\") pod \"kube-state-metrics-0\" (UID: \"817d20b1-4cfa-4cae-98ae-cf2e4f379726\") " pod="openstack/kube-state-metrics-0" Dec 05 08:41:46 crc kubenswrapper[4795]: I1205 08:41:46.440650 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-txz62\" (UniqueName: \"kubernetes.io/projected/817d20b1-4cfa-4cae-98ae-cf2e4f379726-kube-api-access-txz62\") pod \"kube-state-metrics-0\" (UID: \"817d20b1-4cfa-4cae-98ae-cf2e4f379726\") " pod="openstack/kube-state-metrics-0" Dec 05 08:41:46 crc kubenswrapper[4795]: I1205 08:41:46.478957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"468214b1-1b8a-4714-a2b5-9913dead10a6","Type":"ContainerStarted","Data":"fac1c3f3d4c450912b8aa604821bd7ae45929b5e6202666b10614d754e4fe791"} Dec 05 08:41:46 crc kubenswrapper[4795]: I1205 08:41:46.519751 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"09a55d95-050f-4262-9bb4-7dc81ae6ea34","Type":"ContainerStarted","Data":"6239306ed3e45134084b09478522df7ed772657c18b8a6e1a141081d8da6804e"} Dec 05 08:41:47 crc kubenswrapper[4795]: I1205 08:41:47.579324 4795 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/kube-state-metrics-0" secret="" err="failed to sync secret cache: timed out waiting for the condition" Dec 05 08:41:47 crc kubenswrapper[4795]: I1205 08:41:47.579436 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 08:41:47 crc kubenswrapper[4795]: I1205 08:41:47.736830 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-tbrv2" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.792966 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:41:48 crc kubenswrapper[4795]: W1205 08:41:48.810584 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod817d20b1_4cfa_4cae_98ae_cf2e4f379726.slice/crio-93de357d1f047405b0144d31e5f9c7ef9d27f9d87dcd954a20fcfcd0690b5228 WatchSource:0}: Error finding container 93de357d1f047405b0144d31e5f9c7ef9d27f9d87dcd954a20fcfcd0690b5228: Status 404 returned error can't find the container with id 93de357d1f047405b0144d31e5f9c7ef9d27f9d87dcd954a20fcfcd0690b5228 Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.903287 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pbgkm"] Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.910987 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.926149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.926483 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.926581 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rl4rf" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.957597 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbgkm"] Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.979007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-run-ovn\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.979194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-run\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.979300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec90f56f-9ed8-4175-9736-6e0f07d7078f-ovn-controller-tls-certs\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.979428 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec90f56f-9ed8-4175-9736-6e0f07d7078f-combined-ca-bundle\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.979529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-log-ovn\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.979714 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2rxq\" (UniqueName: \"kubernetes.io/projected/ec90f56f-9ed8-4175-9736-6e0f07d7078f-kube-api-access-t2rxq\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.979848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec90f56f-9ed8-4175-9736-6e0f07d7078f-scripts\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.980060 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5vnnk"] Dec 05 08:41:48 crc kubenswrapper[4795]: I1205 08:41:48.983604 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.029291 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5vnnk"] Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.081865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-run-ovn\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.081968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-run\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec90f56f-9ed8-4175-9736-6e0f07d7078f-ovn-controller-tls-certs\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082072 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec90f56f-9ed8-4175-9736-6e0f07d7078f-combined-ca-bundle\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082098 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-log-ovn\") pod \"ovn-controller-pbgkm\" 
(UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/168f746c-97aa-4a6b-9e54-d365580aad3e-scripts\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2rxq\" (UniqueName: \"kubernetes.io/projected/ec90f56f-9ed8-4175-9736-6e0f07d7078f-kube-api-access-t2rxq\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-etc-ovs\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082445 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec90f56f-9ed8-4175-9736-6e0f07d7078f-scripts\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-lib\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 
08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wzc\" (UniqueName: \"kubernetes.io/projected/168f746c-97aa-4a6b-9e54-d365580aad3e-kube-api-access-j2wzc\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082543 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-log\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.082580 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-run\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.083564 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-run-ovn\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.083722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-run\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.085396 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ec90f56f-9ed8-4175-9736-6e0f07d7078f-var-log-ovn\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.087199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec90f56f-9ed8-4175-9736-6e0f07d7078f-scripts\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.104119 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec90f56f-9ed8-4175-9736-6e0f07d7078f-ovn-controller-tls-certs\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.105448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec90f56f-9ed8-4175-9736-6e0f07d7078f-combined-ca-bundle\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.122407 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2rxq\" (UniqueName: \"kubernetes.io/projected/ec90f56f-9ed8-4175-9736-6e0f07d7078f-kube-api-access-t2rxq\") pod \"ovn-controller-pbgkm\" (UID: \"ec90f56f-9ed8-4175-9736-6e0f07d7078f\") " pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.192157 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/168f746c-97aa-4a6b-9e54-d365580aad3e-scripts\") pod 
\"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.192231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-etc-ovs\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.192286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-lib\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.192308 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wzc\" (UniqueName: \"kubernetes.io/projected/168f746c-97aa-4a6b-9e54-d365580aad3e-kube-api-access-j2wzc\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.192332 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-log\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.192357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-run\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 
crc kubenswrapper[4795]: I1205 08:41:49.192775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-run\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.194506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-lib\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.194716 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-etc-ovs\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.194816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/168f746c-97aa-4a6b-9e54-d365580aad3e-var-log\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.197303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/168f746c-97aa-4a6b-9e54-d365580aad3e-scripts\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.232725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wzc\" (UniqueName: 
\"kubernetes.io/projected/168f746c-97aa-4a6b-9e54-d365580aad3e-kube-api-access-j2wzc\") pod \"ovn-controller-ovs-5vnnk\" (UID: \"168f746c-97aa-4a6b-9e54-d365580aad3e\") " pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.249170 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbgkm" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.319321 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:41:49 crc kubenswrapper[4795]: I1205 08:41:49.896897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"817d20b1-4cfa-4cae-98ae-cf2e4f379726","Type":"ContainerStarted","Data":"93de357d1f047405b0144d31e5f9c7ef9d27f9d87dcd954a20fcfcd0690b5228"} Dec 05 08:41:50 crc kubenswrapper[4795]: I1205 08:41:50.162628 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbgkm"] Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.019757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm" event={"ID":"ec90f56f-9ed8-4175-9736-6e0f07d7078f","Type":"ContainerStarted","Data":"c3ea7f0360890071d9457f1fa2af1d11027d0f184c7c9bc7f448499bd2876150"} Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.184240 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5vnnk"] Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.497191 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.502556 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.509844 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2d956" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.509972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.510013 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.510077 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.510248 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.528842 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.701713 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90e5bcaa-7346-4ac2-bb1b-453e46dec234-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.701772 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90e5bcaa-7346-4ac2-bb1b-453e46dec234-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.702435 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.702558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90e5bcaa-7346-4ac2-bb1b-453e46dec234-config\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.702598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5vrh\" (UniqueName: \"kubernetes.io/projected/90e5bcaa-7346-4ac2-bb1b-453e46dec234-kube-api-access-f5vrh\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.702681 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.702745 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.702796 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.805485 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.805551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.805645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90e5bcaa-7346-4ac2-bb1b-453e46dec234-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.805670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90e5bcaa-7346-4ac2-bb1b-453e46dec234-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.805690 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 
05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.805717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90e5bcaa-7346-4ac2-bb1b-453e46dec234-config\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.805763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5vrh\" (UniqueName: \"kubernetes.io/projected/90e5bcaa-7346-4ac2-bb1b-453e46dec234-kube-api-access-f5vrh\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.805831 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.807953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90e5bcaa-7346-4ac2-bb1b-453e46dec234-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.808031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90e5bcaa-7346-4ac2-bb1b-453e46dec234-config\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.808316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/90e5bcaa-7346-4ac2-bb1b-453e46dec234-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.815199 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.816579 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.820504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.828725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e5bcaa-7346-4ac2-bb1b-453e46dec234-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.839497 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5vrh\" (UniqueName: \"kubernetes.io/projected/90e5bcaa-7346-4ac2-bb1b-453e46dec234-kube-api-access-f5vrh\") pod \"ovsdbserver-nb-0\" (UID: 
\"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:51 crc kubenswrapper[4795]: I1205 08:41:51.840645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"90e5bcaa-7346-4ac2-bb1b-453e46dec234\") " pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:52 crc kubenswrapper[4795]: I1205 08:41:52.049869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5vnnk" event={"ID":"168f746c-97aa-4a6b-9e54-d365580aad3e","Type":"ContainerStarted","Data":"aac9de9c3e57c19f7de444f31c85b0422193e4928dca5f13cc8cd8f0f9ff011f"} Dec 05 08:41:52 crc kubenswrapper[4795]: I1205 08:41:52.138314 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 05 08:41:52 crc kubenswrapper[4795]: I1205 08:41:52.916964 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-842lt"] Dec 05 08:41:52 crc kubenswrapper[4795]: I1205 08:41:52.918552 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:52 crc kubenswrapper[4795]: I1205 08:41:52.924350 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 05 08:41:52 crc kubenswrapper[4795]: I1205 08:41:52.934720 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-842lt"] Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.042981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e19a64f-4ae5-4731-98e0-dfef56849949-config\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.043753 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4e19a64f-4ae5-4731-98e0-dfef56849949-ovs-rundir\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.043810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576mp\" (UniqueName: \"kubernetes.io/projected/4e19a64f-4ae5-4731-98e0-dfef56849949-kube-api-access-576mp\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.043849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4e19a64f-4ae5-4731-98e0-dfef56849949-ovn-rundir\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " 
pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.043913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e19a64f-4ae5-4731-98e0-dfef56849949-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.043948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e19a64f-4ae5-4731-98e0-dfef56849949-combined-ca-bundle\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.145675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e19a64f-4ae5-4731-98e0-dfef56849949-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.145769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e19a64f-4ae5-4731-98e0-dfef56849949-combined-ca-bundle\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt" Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.145809 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e19a64f-4ae5-4731-98e0-dfef56849949-config\") pod \"ovn-controller-metrics-842lt\" (UID: 
\"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.145841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4e19a64f-4ae5-4731-98e0-dfef56849949-ovs-rundir\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.145886 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-576mp\" (UniqueName: \"kubernetes.io/projected/4e19a64f-4ae5-4731-98e0-dfef56849949-kube-api-access-576mp\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.145913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4e19a64f-4ae5-4731-98e0-dfef56849949-ovn-rundir\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.146420 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4e19a64f-4ae5-4731-98e0-dfef56849949-ovn-rundir\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.151062 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e19a64f-4ae5-4731-98e0-dfef56849949-config\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.151266 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4e19a64f-4ae5-4731-98e0-dfef56849949-ovs-rundir\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.161160 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e19a64f-4ae5-4731-98e0-dfef56849949-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.161288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e19a64f-4ae5-4731-98e0-dfef56849949-combined-ca-bundle\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.205460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-576mp\" (UniqueName: \"kubernetes.io/projected/4e19a64f-4ae5-4731-98e0-dfef56849949-kube-api-access-576mp\") pod \"ovn-controller-metrics-842lt\" (UID: \"4e19a64f-4ae5-4731-98e0-dfef56849949\") " pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:53 crc kubenswrapper[4795]: I1205 08:41:53.260369 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-842lt"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.041469 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.044386 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.047378 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.048295 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.050043 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-m95nr"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.050065 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.063250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.175288 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/770d6237-f0f8-4646-9df7-85f07fa9f48b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.175360 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.175425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.175545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.175646 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7b5\" (UniqueName: \"kubernetes.io/projected/770d6237-f0f8-4646-9df7-85f07fa9f48b-kube-api-access-9f7b5\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.175698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/770d6237-f0f8-4646-9df7-85f07fa9f48b-config\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.175727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.175798 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/770d6237-f0f8-4646-9df7-85f07fa9f48b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.277375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/770d6237-f0f8-4646-9df7-85f07fa9f48b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.277439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/770d6237-f0f8-4646-9df7-85f07fa9f48b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.277464 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.277505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.277541 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.277581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7b5\" (UniqueName: \"kubernetes.io/projected/770d6237-f0f8-4646-9df7-85f07fa9f48b-kube-api-access-9f7b5\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.277630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/770d6237-f0f8-4646-9df7-85f07fa9f48b-config\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.277652 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.280172 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/770d6237-f0f8-4646-9df7-85f07fa9f48b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.280554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/770d6237-f0f8-4646-9df7-85f07fa9f48b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.281793 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.282188 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/770d6237-f0f8-4646-9df7-85f07fa9f48b-config\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.284870 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.285526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.291487 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/770d6237-f0f8-4646-9df7-85f07fa9f48b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.305016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7b5\" (UniqueName: \"kubernetes.io/projected/770d6237-f0f8-4646-9df7-85f07fa9f48b-kube-api-access-9f7b5\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.325312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"770d6237-f0f8-4646-9df7-85f07fa9f48b\") " pod="openstack/ovsdbserver-sb-0"
Dec 05 08:41:54 crc kubenswrapper[4795]: I1205 08:41:54.423449 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Dec 05 08:42:08 crc kubenswrapper[4795]: E1205 08:42:08.288387 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified"
Dec 05 08:42:08 crc kubenswrapper[4795]: E1205 08:42:08.289665 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8b24k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(09a55d95-050f-4262-9bb4-7dc81ae6ea34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 08:42:08 crc kubenswrapper[4795]: E1205 08:42:08.290915 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="09a55d95-050f-4262-9bb4-7dc81ae6ea34"
Dec 05 08:42:08 crc kubenswrapper[4795]: E1205 08:42:08.315336 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="09a55d95-050f-4262-9bb4-7dc81ae6ea34"
Dec 05 08:42:09 crc kubenswrapper[4795]: E1205 08:42:09.382474 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified"
Dec 05 08:42:09 crc kubenswrapper[4795]: E1205 08:42:09.383300 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qf66k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(956aa512-9ab5-4c74-863b-3ed2a14535d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 08:42:09 crc kubenswrapper[4795]: E1205 08:42:09.384463 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9"
Dec 05 08:42:10 crc kubenswrapper[4795]: E1205 08:42:10.213111 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified"
Dec 05 08:42:10 crc kubenswrapper[4795]: E1205 08:42:10.213357 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n69h54bhc7hb5h6bh648h574h98h654h78h5fdh667hf8h659h67bh6h647h57h595h5b7h64h66chfch596h656h5ch648h546h58chcfh649hbfq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8kdms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(468214b1-1b8a-4714-a2b5-9913dead10a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 08:42:10 crc kubenswrapper[4795]: E1205 08:42:10.215318 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="468214b1-1b8a-4714-a2b5-9913dead10a6"
Dec 05 08:42:10 crc kubenswrapper[4795]: E1205 08:42:10.330459 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="468214b1-1b8a-4714-a2b5-9913dead10a6"
Dec 05 08:42:10 crc kubenswrapper[4795]: E1205 08:42:10.330517 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.639866 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.640523 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d8h5f7h646h657h578hfdh77h556h57fh57dh5c7h554h5f6h55fh655h689h65ch5b4h674h669h5ch9ch577h5h558h5f9h645h575h58chf8hcfh5b9q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t2rxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-pbgkm_openstack(ec90f56f-9ed8-4175-9736-6e0f07d7078f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.641671 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-pbgkm" podUID="ec90f56f-9ed8-4175-9736-6e0f07d7078f"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.655505 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.655787 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nfsr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(ec8515f5-24b3-4930-9df2-90c25e2f8e6e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.657000 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.677367 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.677565 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xbxbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 08:42:11 crc kubenswrapper[4795]: E1205 08:42:11.679175 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6"
Dec 05 08:42:12 crc kubenswrapper[4795]: E1205 08:42:12.353231 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6"
Dec 05 08:42:12 crc kubenswrapper[4795]: E1205 08:42:12.353604 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-pbgkm" podUID="ec90f56f-9ed8-4175-9736-6e0f07d7078f"
Dec 05 08:42:12 crc kubenswrapper[4795]: E1205 08:42:12.354982 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e"
Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.930947 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.931840 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vtb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-fvs5f_openstack(5ee704d2-2665-4802-874e-3a0e7573e39d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.934040 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" podUID="5ee704d2-2665-4802-874e-3a0e7573e39d" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.950530 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.950773 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m58qh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-hzq8h_openstack(58475528-6db5-4fb0-bcee-67d276358d8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.951987 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" podUID="58475528-6db5-4fb0-bcee-67d276358d8b" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.964017 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.964226 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jk52n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-j52js_openstack(85663bec-4131-4032-ae97-3ad346ea96ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.966481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" podUID="85663bec-4131-4032-ae97-3ad346ea96ec" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.997889 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.998297 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mxjzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-9gb88_openstack(b716ddfa-bbff-444b-bed7-275b451068bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:42:17 crc kubenswrapper[4795]: E1205 08:42:17.999710 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" podUID="b716ddfa-bbff-444b-bed7-275b451068bf" Dec 05 08:42:18 crc kubenswrapper[4795]: E1205 08:42:18.429279 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" podUID="b716ddfa-bbff-444b-bed7-275b451068bf" Dec 05 08:42:18 crc kubenswrapper[4795]: E1205 08:42:18.429963 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" podUID="5ee704d2-2665-4802-874e-3a0e7573e39d" Dec 05 08:42:18 crc kubenswrapper[4795]: I1205 08:42:18.461749 4795 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 05 08:42:18 crc kubenswrapper[4795]: I1205 08:42:18.499527 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-842lt"] Dec 05 08:42:18 crc kubenswrapper[4795]: I1205 08:42:18.706951 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 05 08:42:18 crc kubenswrapper[4795]: I1205 08:42:18.858404 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:42:18 crc kubenswrapper[4795]: I1205 08:42:18.977893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85663bec-4131-4032-ae97-3ad346ea96ec-config\") pod \"85663bec-4131-4032-ae97-3ad346ea96ec\" (UID: \"85663bec-4131-4032-ae97-3ad346ea96ec\") " Dec 05 08:42:18 crc kubenswrapper[4795]: I1205 08:42:18.978085 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk52n\" (UniqueName: \"kubernetes.io/projected/85663bec-4131-4032-ae97-3ad346ea96ec-kube-api-access-jk52n\") pod \"85663bec-4131-4032-ae97-3ad346ea96ec\" (UID: \"85663bec-4131-4032-ae97-3ad346ea96ec\") " Dec 05 08:42:18 crc kubenswrapper[4795]: I1205 08:42:18.980293 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85663bec-4131-4032-ae97-3ad346ea96ec-config" (OuterVolumeSpecName: "config") pod "85663bec-4131-4032-ae97-3ad346ea96ec" (UID: "85663bec-4131-4032-ae97-3ad346ea96ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:18 crc kubenswrapper[4795]: I1205 08:42:18.990939 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85663bec-4131-4032-ae97-3ad346ea96ec-kube-api-access-jk52n" (OuterVolumeSpecName: "kube-api-access-jk52n") pod "85663bec-4131-4032-ae97-3ad346ea96ec" (UID: "85663bec-4131-4032-ae97-3ad346ea96ec"). InnerVolumeSpecName "kube-api-access-jk52n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.080693 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk52n\" (UniqueName: \"kubernetes.io/projected/85663bec-4131-4032-ae97-3ad346ea96ec-kube-api-access-jk52n\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.080739 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85663bec-4131-4032-ae97-3ad346ea96ec-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:19 crc kubenswrapper[4795]: E1205 08:42:19.285953 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 05 08:42:19 crc kubenswrapper[4795]: E1205 08:42:19.286317 4795 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 05 08:42:19 crc kubenswrapper[4795]: E1205 08:42:19.286587 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods 
--namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-txz62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(817d20b1-4cfa-4cae-98ae-cf2e4f379726): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled" logger="UnhandledError" Dec 05 08:42:19 crc kubenswrapper[4795]: E1205 08:42:19.290244 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="817d20b1-4cfa-4cae-98ae-cf2e4f379726" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.399712 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.438011 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" event={"ID":"58475528-6db5-4fb0-bcee-67d276358d8b","Type":"ContainerDied","Data":"4c970e068f63cf71e21d7b07d06e31eaba88276b45b6494cb39dbc418a4b270c"} Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.438031 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-hzq8h" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.440013 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"90e5bcaa-7346-4ac2-bb1b-453e46dec234","Type":"ContainerStarted","Data":"39c56a318519fd574d9f1565232045001e7d82df1a8d8ff547cf12964a56e783"} Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.443224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"770d6237-f0f8-4646-9df7-85f07fa9f48b","Type":"ContainerStarted","Data":"17bea090637947c5bce38d84928add965a58f1c48c287fc48b662c20aa445b4d"} Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.445493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" event={"ID":"85663bec-4131-4032-ae97-3ad346ea96ec","Type":"ContainerDied","Data":"cee92f6a029c2150365b8ab5f2c46a599873c428aa55f5cbb6f6f0c290d064ec"} Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.445538 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j52js" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.447258 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-842lt" event={"ID":"4e19a64f-4ae5-4731-98e0-dfef56849949","Type":"ContainerStarted","Data":"7177e910c013c02f6c7cefc02319cbf2e8bc6eb3802da876e9075e359f25e876"} Dec 05 08:42:19 crc kubenswrapper[4795]: E1205 08:42:19.453667 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="817d20b1-4cfa-4cae-98ae-cf2e4f379726" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.487672 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m58qh\" (UniqueName: \"kubernetes.io/projected/58475528-6db5-4fb0-bcee-67d276358d8b-kube-api-access-m58qh\") pod \"58475528-6db5-4fb0-bcee-67d276358d8b\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.487738 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-dns-svc\") pod \"58475528-6db5-4fb0-bcee-67d276358d8b\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.487800 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-config\") pod \"58475528-6db5-4fb0-bcee-67d276358d8b\" (UID: \"58475528-6db5-4fb0-bcee-67d276358d8b\") " Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.489091 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-config" (OuterVolumeSpecName: "config") pod "58475528-6db5-4fb0-bcee-67d276358d8b" (UID: "58475528-6db5-4fb0-bcee-67d276358d8b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.489482 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "58475528-6db5-4fb0-bcee-67d276358d8b" (UID: "58475528-6db5-4fb0-bcee-67d276358d8b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.494568 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58475528-6db5-4fb0-bcee-67d276358d8b-kube-api-access-m58qh" (OuterVolumeSpecName: "kube-api-access-m58qh") pod "58475528-6db5-4fb0-bcee-67d276358d8b" (UID: "58475528-6db5-4fb0-bcee-67d276358d8b"). InnerVolumeSpecName "kube-api-access-m58qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.560945 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j52js"] Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.569574 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j52js"] Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.591868 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m58qh\" (UniqueName: \"kubernetes.io/projected/58475528-6db5-4fb0-bcee-67d276358d8b-kube-api-access-m58qh\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.591908 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.591924 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58475528-6db5-4fb0-bcee-67d276358d8b-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.839680 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-hzq8h"] Dec 05 08:42:19 crc kubenswrapper[4795]: I1205 08:42:19.847269 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-hzq8h"] Dec 05 08:42:20 crc kubenswrapper[4795]: I1205 08:42:20.457080 4795 generic.go:334] "Generic (PLEG): container finished" podID="168f746c-97aa-4a6b-9e54-d365580aad3e" containerID="2dcfcd8879341581e067a360788054c5e0187dc4867e8e1f33616aa8768792a1" exitCode=0 Dec 05 08:42:20 crc kubenswrapper[4795]: I1205 08:42:20.457666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5vnnk" 
event={"ID":"168f746c-97aa-4a6b-9e54-d365580aad3e","Type":"ContainerDied","Data":"2dcfcd8879341581e067a360788054c5e0187dc4867e8e1f33616aa8768792a1"} Dec 05 08:42:20 crc kubenswrapper[4795]: I1205 08:42:20.806533 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58475528-6db5-4fb0-bcee-67d276358d8b" path="/var/lib/kubelet/pods/58475528-6db5-4fb0-bcee-67d276358d8b/volumes" Dec 05 08:42:20 crc kubenswrapper[4795]: I1205 08:42:20.808744 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85663bec-4131-4032-ae97-3ad346ea96ec" path="/var/lib/kubelet/pods/85663bec-4131-4032-ae97-3ad346ea96ec/volumes" Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.542953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6","Type":"ContainerStarted","Data":"5885974cc68159384c941097500ed36bc99f19fe43a6c49e924ef089d9debcb5"} Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.547263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"468214b1-1b8a-4714-a2b5-9913dead10a6","Type":"ContainerStarted","Data":"8930fb9570ee82640e46c432d86ad4fec22a7a47146138e9facbe3ae09b7e797"} Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.547533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.550575 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"09a55d95-050f-4262-9bb4-7dc81ae6ea34","Type":"ContainerStarted","Data":"4f9713660c4f43e4d1a3f865af875d98e1069efd29556140cf1e36f3079ebe6e"} Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.554179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-842lt" 
event={"ID":"4e19a64f-4ae5-4731-98e0-dfef56849949","Type":"ContainerStarted","Data":"ee8d70306896e10735c85d56a6d1cf03a8a96cd931d69c090ebc48f1909cd8c2"} Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.558408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5vnnk" event={"ID":"168f746c-97aa-4a6b-9e54-d365580aad3e","Type":"ContainerStarted","Data":"fc64fc03275c67ab387418035e8c6baa5d68c999ba6f524bcea73e4957cf0eab"} Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.559943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"90e5bcaa-7346-4ac2-bb1b-453e46dec234","Type":"ContainerStarted","Data":"d165a525912aa0409996090869e43126f2eb75db3d711f2dd9d435f5bc3815da"} Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.561547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"770d6237-f0f8-4646-9df7-85f07fa9f48b","Type":"ContainerStarted","Data":"dfa231b04155020b37a1ec01b9e947d1b197f79c0047501bb554dc72fa34d15b"} Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.608508 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-842lt" podStartSLOduration=27.515618929 podStartE2EDuration="37.60847208s" podCreationTimestamp="2025-12-05 08:41:52 +0000 UTC" firstStartedPulling="2025-12-05 08:42:18.645510047 +0000 UTC m=+1090.218113786" lastFinishedPulling="2025-12-05 08:42:28.738363198 +0000 UTC m=+1100.310966937" observedRunningTime="2025-12-05 08:42:29.596991965 +0000 UTC m=+1101.169595704" watchObservedRunningTime="2025-12-05 08:42:29.60847208 +0000 UTC m=+1101.181075819" Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.676005 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.207328761 podStartE2EDuration="45.675977925s" podCreationTimestamp="2025-12-05 08:41:44 +0000 UTC" 
firstStartedPulling="2025-12-05 08:41:45.278360457 +0000 UTC m=+1056.850964196" lastFinishedPulling="2025-12-05 08:42:28.747009621 +0000 UTC m=+1100.319613360" observedRunningTime="2025-12-05 08:42:29.671415628 +0000 UTC m=+1101.244019367" watchObservedRunningTime="2025-12-05 08:42:29.675977925 +0000 UTC m=+1101.248581674" Dec 05 08:42:29 crc kubenswrapper[4795]: I1205 08:42:29.961724 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fvs5f"] Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.023439 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nk88t"] Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.025067 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.041665 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.053129 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nk88t"] Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.184268 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6m5j\" (UniqueName: \"kubernetes.io/projected/43329d03-0f41-4c52-ab0f-c3973d35e536-kube-api-access-k6m5j\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.184352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-config\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc 
kubenswrapper[4795]: I1205 08:42:30.184421 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.184446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.252642 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9gb88"] Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.286177 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-config\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.286768 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.286801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: 
\"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.286866 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6m5j\" (UniqueName: \"kubernetes.io/projected/43329d03-0f41-4c52-ab0f-c3973d35e536-kube-api-access-k6m5j\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.288396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-config\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.289006 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.289008 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.356685 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6m5j\" (UniqueName: \"kubernetes.io/projected/43329d03-0f41-4c52-ab0f-c3973d35e536-kube-api-access-k6m5j\") pod \"dnsmasq-dns-5bf47b49b7-nk88t\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" 
Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.467704 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-8df4h"] Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.469372 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.481199 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.481394 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8df4h"] Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.494999 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.495077 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.495109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-dns-svc\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.496091 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-config\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.500292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nxfn\" (UniqueName: \"kubernetes.io/projected/ac0ebab7-6587-452e-b3ed-7c6d208c637c-kube-api-access-9nxfn\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.581208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm" event={"ID":"ec90f56f-9ed8-4175-9736-6e0f07d7078f","Type":"ContainerStarted","Data":"72e48220242edd5e605659b7262bf59211e2cec27c86297930ff8edf17443c1b"} Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.581793 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-pbgkm" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.604451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5vnnk" event={"ID":"168f746c-97aa-4a6b-9e54-d365580aad3e","Type":"ContainerStarted","Data":"405311bd010fc19d92c85fc1801774361c74c4173190221bc6b98315fc4de497"} Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.604835 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.604865 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.604941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.605233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.605260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-dns-svc\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.605292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-config\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.605342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nxfn\" (UniqueName: \"kubernetes.io/projected/ac0ebab7-6587-452e-b3ed-7c6d208c637c-kube-api-access-9nxfn\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.607742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"90e5bcaa-7346-4ac2-bb1b-453e46dec234","Type":"ContainerStarted","Data":"ac4a58dbd15b77b6bf38e51322d1a9fe220fd167ad7114f2ad65f8b8feeb66ad"} Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.614104 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-pbgkm" podStartSLOduration=4.056547125 podStartE2EDuration="42.614078904s" podCreationTimestamp="2025-12-05 08:41:48 +0000 UTC" firstStartedPulling="2025-12-05 08:41:50.218438836 +0000 UTC m=+1061.791042565" lastFinishedPulling="2025-12-05 08:42:28.775970595 +0000 UTC m=+1100.348574344" observedRunningTime="2025-12-05 08:42:30.611372774 +0000 UTC m=+1102.183976513" watchObservedRunningTime="2025-12-05 08:42:30.614078904 +0000 UTC m=+1102.186682643" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.616776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.617978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-config\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.618245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.619740 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-dns-svc\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.620307 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec8515f5-24b3-4930-9df2-90c25e2f8e6e","Type":"ContainerStarted","Data":"1c8eefd545af59a05a444f037361105679aae6c1a607df37c24bb29c03aba3d2"} Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.627220 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"956aa512-9ab5-4c74-863b-3ed2a14535d9","Type":"ContainerStarted","Data":"7ee452d101693c0da47a464215ca51c496b979b8ffd1ec947fd8d52320f0ac04"} Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.633025 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"770d6237-f0f8-4646-9df7-85f07fa9f48b","Type":"ContainerStarted","Data":"2aa8f27a3924723ce714b7286249391bf1f7704d7d7db2c339fdd8c7da6669f4"} Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.671010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nxfn\" (UniqueName: \"kubernetes.io/projected/ac0ebab7-6587-452e-b3ed-7c6d208c637c-kube-api-access-9nxfn\") pod \"dnsmasq-dns-8554648995-8df4h\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") " pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.673118 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.768961 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=31.438202618 podStartE2EDuration="40.768935393s" podCreationTimestamp="2025-12-05 08:41:50 +0000 UTC" firstStartedPulling="2025-12-05 08:42:19.310019385 +0000 UTC m=+1090.882623124" lastFinishedPulling="2025-12-05 08:42:28.64075216 +0000 UTC m=+1100.213355899" observedRunningTime="2025-12-05 08:42:30.673282725 +0000 UTC m=+1102.245886464" watchObservedRunningTime="2025-12-05 08:42:30.768935393 +0000 UTC m=+1102.341539132" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.790990 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.823267 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5vnnk" podStartSLOduration=16.603112798 podStartE2EDuration="42.82324388s" podCreationTimestamp="2025-12-05 08:41:48 +0000 UTC" firstStartedPulling="2025-12-05 08:41:51.842209277 +0000 UTC m=+1063.414813016" lastFinishedPulling="2025-12-05 08:42:18.062340359 +0000 UTC m=+1089.634944098" observedRunningTime="2025-12-05 08:42:30.765790693 +0000 UTC m=+1102.338394442" watchObservedRunningTime="2025-12-05 08:42:30.82324388 +0000 UTC m=+1102.395847619" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.827522 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=28.702626075 podStartE2EDuration="38.827503889s" podCreationTimestamp="2025-12-05 08:41:52 +0000 UTC" firstStartedPulling="2025-12-05 08:42:18.515129366 +0000 UTC m=+1090.087733115" lastFinishedPulling="2025-12-05 08:42:28.6400072 +0000 UTC m=+1100.212610929" observedRunningTime="2025-12-05 08:42:30.822416768 
+0000 UTC m=+1102.395020507" watchObservedRunningTime="2025-12-05 08:42:30.827503889 +0000 UTC m=+1102.400107628" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.858981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.944226 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-dns-svc\") pod \"5ee704d2-2665-4802-874e-3a0e7573e39d\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.944370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vtb5\" (UniqueName: \"kubernetes.io/projected/5ee704d2-2665-4802-874e-3a0e7573e39d-kube-api-access-2vtb5\") pod \"5ee704d2-2665-4802-874e-3a0e7573e39d\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.944437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-config\") pod \"5ee704d2-2665-4802-874e-3a0e7573e39d\" (UID: \"5ee704d2-2665-4802-874e-3a0e7573e39d\") " Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.957807 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee704d2-2665-4802-874e-3a0e7573e39d-kube-api-access-2vtb5" (OuterVolumeSpecName: "kube-api-access-2vtb5") pod "5ee704d2-2665-4802-874e-3a0e7573e39d" (UID: "5ee704d2-2665-4802-874e-3a0e7573e39d"). InnerVolumeSpecName "kube-api-access-2vtb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.996710 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ee704d2-2665-4802-874e-3a0e7573e39d" (UID: "5ee704d2-2665-4802-874e-3a0e7573e39d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:30 crc kubenswrapper[4795]: I1205 08:42:30.996984 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-config" (OuterVolumeSpecName: "config") pod "5ee704d2-2665-4802-874e-3a0e7573e39d" (UID: "5ee704d2-2665-4802-874e-3a0e7573e39d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.056061 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.056098 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ee704d2-2665-4802-874e-3a0e7573e39d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.056109 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vtb5\" (UniqueName: \"kubernetes.io/projected/5ee704d2-2665-4802-874e-3a0e7573e39d-kube-api-access-2vtb5\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.140521 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.382775 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-8554648995-8df4h"] Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.419849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nk88t"] Dec 05 08:42:31 crc kubenswrapper[4795]: W1205 08:42:31.424051 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43329d03_0f41_4c52_ab0f_c3973d35e536.slice/crio-0469c640dde0436c5e3f324141ddce77eace12a5aadc481e439ca58ae1658602 WatchSource:0}: Error finding container 0469c640dde0436c5e3f324141ddce77eace12a5aadc481e439ca58ae1658602: Status 404 returned error can't find the container with id 0469c640dde0436c5e3f324141ddce77eace12a5aadc481e439ca58ae1658602 Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.645940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" event={"ID":"5ee704d2-2665-4802-874e-3a0e7573e39d","Type":"ContainerDied","Data":"1e696cf7c6035448233c8839d8b31fac42764067cefff289cd5b1aab0c1ab5b4"} Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.646000 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fvs5f" Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.649403 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" containerID="a7b69bd72ac337f9b3de8982376982fb6ec24e7ef3b434b9fcb5d32f98dc54c5" exitCode=0 Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.649534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8df4h" event={"ID":"ac0ebab7-6587-452e-b3ed-7c6d208c637c","Type":"ContainerDied","Data":"a7b69bd72ac337f9b3de8982376982fb6ec24e7ef3b434b9fcb5d32f98dc54c5"} Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.649646 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8df4h" event={"ID":"ac0ebab7-6587-452e-b3ed-7c6d208c637c","Type":"ContainerStarted","Data":"c2d24ff8cffe8fedea6e0dd0785072cc78796a78e4f03ec06deca4d8a3980f2e"} Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.665781 4795 generic.go:334] "Generic (PLEG): container finished" podID="43329d03-0f41-4c52-ab0f-c3973d35e536" containerID="adc1dbac879a537e05f28d33ce85fb631fd397bdbc9e3bbc39a9cf80073e3b58" exitCode=0 Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.666070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" event={"ID":"43329d03-0f41-4c52-ab0f-c3973d35e536","Type":"ContainerDied","Data":"adc1dbac879a537e05f28d33ce85fb631fd397bdbc9e3bbc39a9cf80073e3b58"} Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.666191 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" event={"ID":"43329d03-0f41-4c52-ab0f-c3973d35e536","Type":"ContainerStarted","Data":"0469c640dde0436c5e3f324141ddce77eace12a5aadc481e439ca58ae1658602"} Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.668590 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="b716ddfa-bbff-444b-bed7-275b451068bf" containerID="fec007a3874857336cc168f2256234046e89b82470c1144072b9a24413fb9ff9" exitCode=0 Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.669026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" event={"ID":"b716ddfa-bbff-444b-bed7-275b451068bf","Type":"ContainerDied","Data":"fec007a3874857336cc168f2256234046e89b82470c1144072b9a24413fb9ff9"} Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.955696 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fvs5f"] Dec 05 08:42:31 crc kubenswrapper[4795]: I1205 08:42:31.972399 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fvs5f"] Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.090415 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.140380 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.202741 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxjzl\" (UniqueName: \"kubernetes.io/projected/b716ddfa-bbff-444b-bed7-275b451068bf-kube-api-access-mxjzl\") pod \"b716ddfa-bbff-444b-bed7-275b451068bf\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.202930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-dns-svc\") pod \"b716ddfa-bbff-444b-bed7-275b451068bf\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.202992 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-config\") pod \"b716ddfa-bbff-444b-bed7-275b451068bf\" (UID: \"b716ddfa-bbff-444b-bed7-275b451068bf\") " Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.208918 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b716ddfa-bbff-444b-bed7-275b451068bf-kube-api-access-mxjzl" (OuterVolumeSpecName: "kube-api-access-mxjzl") pod "b716ddfa-bbff-444b-bed7-275b451068bf" (UID: "b716ddfa-bbff-444b-bed7-275b451068bf"). InnerVolumeSpecName "kube-api-access-mxjzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.224719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-config" (OuterVolumeSpecName: "config") pod "b716ddfa-bbff-444b-bed7-275b451068bf" (UID: "b716ddfa-bbff-444b-bed7-275b451068bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.225959 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b716ddfa-bbff-444b-bed7-275b451068bf" (UID: "b716ddfa-bbff-444b-bed7-275b451068bf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.305399 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxjzl\" (UniqueName: \"kubernetes.io/projected/b716ddfa-bbff-444b-bed7-275b451068bf-kube-api-access-mxjzl\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.305442 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.305452 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b716ddfa-bbff-444b-bed7-275b451068bf-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.679730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8df4h" event={"ID":"ac0ebab7-6587-452e-b3ed-7c6d208c637c","Type":"ContainerStarted","Data":"851941a42cf2fd0a1266f20756cb4f5c90156ec2f0adc75d9897a69d9fa0be7a"} Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.680918 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.682491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" event={"ID":"43329d03-0f41-4c52-ab0f-c3973d35e536","Type":"ContainerStarted","Data":"1e90f80964e027589bb397c0aba92a9326a71eca01bfd5a8a3ac60468729c0bf"} Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.682774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.685347 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.686807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" event={"ID":"b716ddfa-bbff-444b-bed7-275b451068bf","Type":"ContainerDied","Data":"707387eacf588c96ef19127d41ec8c752928761605241c7c68dfc2e3cdd959ef"} Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.686908 4795 scope.go:117] "RemoveContainer" containerID="fec007a3874857336cc168f2256234046e89b82470c1144072b9a24413fb9ff9" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.719579 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-8df4h" podStartSLOduration=2.719549314 podStartE2EDuration="2.719549314s" podCreationTimestamp="2025-12-05 08:42:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:42:32.716971468 +0000 UTC m=+1104.289575197" watchObservedRunningTime="2025-12-05 08:42:32.719549314 +0000 UTC m=+1104.292153053" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.757929 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" podStartSLOduration=3.7579034890000003 podStartE2EDuration="3.757903489s" podCreationTimestamp="2025-12-05 08:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:42:32.757748625 +0000 UTC m=+1104.330352374" watchObservedRunningTime="2025-12-05 08:42:32.757903489 +0000 UTC m=+1104.330507228" Dec 05 08:42:32 crc kubenswrapper[4795]: I1205 08:42:32.765670 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee704d2-2665-4802-874e-3a0e7573e39d" path="/var/lib/kubelet/pods/5ee704d2-2665-4802-874e-3a0e7573e39d/volumes" Dec 05 08:42:33 crc kubenswrapper[4795]: 
I1205 08:42:33.424841 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 05 08:42:33 crc kubenswrapper[4795]: I1205 08:42:33.464736 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 05 08:42:33 crc kubenswrapper[4795]: I1205 08:42:33.698290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"817d20b1-4cfa-4cae-98ae-cf2e4f379726","Type":"ContainerStarted","Data":"0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1"} Dec 05 08:42:33 crc kubenswrapper[4795]: I1205 08:42:33.700433 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 08:42:33 crc kubenswrapper[4795]: I1205 08:42:33.705382 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 05 08:42:34 crc kubenswrapper[4795]: I1205 08:42:34.190649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 05 08:42:34 crc kubenswrapper[4795]: I1205 08:42:34.219034 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.880374649 podStartE2EDuration="48.21900577s" podCreationTimestamp="2025-12-05 08:41:46 +0000 UTC" firstStartedPulling="2025-12-05 08:41:48.814376472 +0000 UTC m=+1060.386980211" lastFinishedPulling="2025-12-05 08:42:33.153007583 +0000 UTC m=+1104.725611332" observedRunningTime="2025-12-05 08:42:33.72792908 +0000 UTC m=+1105.300532859" watchObservedRunningTime="2025-12-05 08:42:34.21900577 +0000 UTC m=+1105.791609499" Dec 05 08:42:34 crc kubenswrapper[4795]: I1205 08:42:34.462311 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 05 08:42:34 crc kubenswrapper[4795]: I1205 08:42:34.647783 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 05 08:42:34 crc kubenswrapper[4795]: I1205 08:42:34.711367 4795 generic.go:334] "Generic (PLEG): container finished" podID="09a55d95-050f-4262-9bb4-7dc81ae6ea34" containerID="4f9713660c4f43e4d1a3f865af875d98e1069efd29556140cf1e36f3079ebe6e" exitCode=0 Dec 05 08:42:34 crc kubenswrapper[4795]: I1205 08:42:34.712334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"09a55d95-050f-4262-9bb4-7dc81ae6ea34","Type":"ContainerDied","Data":"4f9713660c4f43e4d1a3f865af875d98e1069efd29556140cf1e36f3079ebe6e"} Dec 05 08:42:34 crc kubenswrapper[4795]: I1205 08:42:34.798163 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.048191 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 05 08:42:35 crc kubenswrapper[4795]: E1205 08:42:35.049155 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b716ddfa-bbff-444b-bed7-275b451068bf" containerName="init" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.049249 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b716ddfa-bbff-444b-bed7-275b451068bf" containerName="init" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.049508 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b716ddfa-bbff-444b-bed7-275b451068bf" containerName="init" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.050508 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.054122 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.055297 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.055530 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.056092 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jcsbw" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.074564 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.172052 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hzlr\" (UniqueName: \"kubernetes.io/projected/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-kube-api-access-7hzlr\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.172151 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.172201 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " 
pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.172229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.172303 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.172371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-config\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.172401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-scripts\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.273903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.273999 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.274045 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.274115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.274176 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-config\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.274217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-scripts\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.274285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hzlr\" (UniqueName: \"kubernetes.io/projected/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-kube-api-access-7hzlr\") pod \"ovn-northd-0\" (UID: 
\"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.275384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.276237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-scripts\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.276250 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-config\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.281841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.291758 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.292646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.297110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hzlr\" (UniqueName: \"kubernetes.io/projected/6d7fe8fd-0377-462d-b17e-1ec92a8d0464-kube-api-access-7hzlr\") pod \"ovn-northd-0\" (UID: \"6d7fe8fd-0377-462d-b17e-1ec92a8d0464\") " pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.367396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.733421 4795 generic.go:334] "Generic (PLEG): container finished" podID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerID="5885974cc68159384c941097500ed36bc99f19fe43a6c49e924ef089d9debcb5" exitCode=0 Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.733666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6","Type":"ContainerDied","Data":"5885974cc68159384c941097500ed36bc99f19fe43a6c49e924ef089d9debcb5"} Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.738813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"09a55d95-050f-4262-9bb4-7dc81ae6ea34","Type":"ContainerStarted","Data":"c69649181d3c5a4ebd0f724b02420d7e0d28b4911070ccfc13771cbd784b5bd3"} Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.791731 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=10.345307647 podStartE2EDuration="53.791704948s" podCreationTimestamp="2025-12-05 08:41:42 +0000 UTC" firstStartedPulling="2025-12-05 08:41:45.278323717 +0000 UTC m=+1056.850927456" lastFinishedPulling="2025-12-05 
08:42:28.724721028 +0000 UTC m=+1100.297324757" observedRunningTime="2025-12-05 08:42:35.778355634 +0000 UTC m=+1107.350959373" watchObservedRunningTime="2025-12-05 08:42:35.791704948 +0000 UTC m=+1107.364308697" Dec 05 08:42:35 crc kubenswrapper[4795]: I1205 08:42:35.857354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 05 08:42:35 crc kubenswrapper[4795]: W1205 08:42:35.858972 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d7fe8fd_0377_462d_b17e_1ec92a8d0464.slice/crio-20881b77533a6c5f11ab9db64dcb74bd4e92ff19521b91ea045844616efbadd8 WatchSource:0}: Error finding container 20881b77533a6c5f11ab9db64dcb74bd4e92ff19521b91ea045844616efbadd8: Status 404 returned error can't find the container with id 20881b77533a6c5f11ab9db64dcb74bd4e92ff19521b91ea045844616efbadd8 Dec 05 08:42:36 crc kubenswrapper[4795]: I1205 08:42:36.770561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6d7fe8fd-0377-462d-b17e-1ec92a8d0464","Type":"ContainerStarted","Data":"20881b77533a6c5f11ab9db64dcb74bd4e92ff19521b91ea045844616efbadd8"} Dec 05 08:42:36 crc kubenswrapper[4795]: I1205 08:42:36.778297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6","Type":"ContainerStarted","Data":"0f80fc818a1678d95f8657076e325f2486c96dc5376a7c5af6fc04693f6f55a6"} Dec 05 08:42:36 crc kubenswrapper[4795]: I1205 08:42:36.833474 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=11.623755133 podStartE2EDuration="55.83344515s" podCreationTimestamp="2025-12-05 08:41:41 +0000 UTC" firstStartedPulling="2025-12-05 08:41:44.561037463 +0000 UTC m=+1056.133641192" lastFinishedPulling="2025-12-05 08:42:28.77072747 +0000 UTC m=+1100.343331209" observedRunningTime="2025-12-05 
08:42:36.830378281 +0000 UTC m=+1108.402982020" watchObservedRunningTime="2025-12-05 08:42:36.83344515 +0000 UTC m=+1108.406048889" Dec 05 08:42:36 crc kubenswrapper[4795]: I1205 08:42:36.860883 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nk88t"] Dec 05 08:42:36 crc kubenswrapper[4795]: I1205 08:42:36.861271 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" podUID="43329d03-0f41-4c52-ab0f-c3973d35e536" containerName="dnsmasq-dns" containerID="cri-o://1e90f80964e027589bb397c0aba92a9326a71eca01bfd5a8a3ac60468729c0bf" gracePeriod=10 Dec 05 08:42:36 crc kubenswrapper[4795]: I1205 08:42:36.887123 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jpnk4"] Dec 05 08:42:36 crc kubenswrapper[4795]: I1205 08:42:36.890302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:36 crc kubenswrapper[4795]: I1205 08:42:36.972869 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jpnk4"] Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.011496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.012041 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: 
I1205 08:42:37.012110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb6bq\" (UniqueName: \"kubernetes.io/projected/fa752e5e-08d3-49db-b564-8efe2d39b1ca-kube-api-access-lb6bq\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.012163 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.012266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-config\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.113540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-config\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.113670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.113695 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.113748 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb6bq\" (UniqueName: \"kubernetes.io/projected/fa752e5e-08d3-49db-b564-8efe2d39b1ca-kube-api-access-lb6bq\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.113959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.116710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.118450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-config\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.120992 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.121070 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.166845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb6bq\" (UniqueName: \"kubernetes.io/projected/fa752e5e-08d3-49db-b564-8efe2d39b1ca-kube-api-access-lb6bq\") pod \"dnsmasq-dns-b8fbc5445-jpnk4\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.223604 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.819292 4795 generic.go:334] "Generic (PLEG): container finished" podID="43329d03-0f41-4c52-ab0f-c3973d35e536" containerID="1e90f80964e027589bb397c0aba92a9326a71eca01bfd5a8a3ac60468729c0bf" exitCode=0 Dec 05 08:42:37 crc kubenswrapper[4795]: I1205 08:42:37.819871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" event={"ID":"43329d03-0f41-4c52-ab0f-c3973d35e536","Type":"ContainerDied","Data":"1e90f80964e027589bb397c0aba92a9326a71eca01bfd5a8a3ac60468729c0bf"} Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.094369 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.123686 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.130261 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-8z44q" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.130501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.130709 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.131010 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.191301 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.224189 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.264431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c346ae47-7294-4960-b4f3-9d791c931a12-cache\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.264653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.264725 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.264991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c346ae47-7294-4960-b4f3-9d791c931a12-lock\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.265068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssg58\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-kube-api-access-ssg58\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.366496 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6m5j\" (UniqueName: \"kubernetes.io/projected/43329d03-0f41-4c52-ab0f-c3973d35e536-kube-api-access-k6m5j\") pod \"43329d03-0f41-4c52-ab0f-c3973d35e536\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.366646 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-ovsdbserver-nb\") pod \"43329d03-0f41-4c52-ab0f-c3973d35e536\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.366835 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-config\") pod \"43329d03-0f41-4c52-ab0f-c3973d35e536\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.366886 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-dns-svc\") pod \"43329d03-0f41-4c52-ab0f-c3973d35e536\" (UID: \"43329d03-0f41-4c52-ab0f-c3973d35e536\") " Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.367097 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c346ae47-7294-4960-b4f3-9d791c931a12-lock\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.367142 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssg58\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-kube-api-access-ssg58\") pod \"swift-storage-0\" (UID: 
\"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.367171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c346ae47-7294-4960-b4f3-9d791c931a12-cache\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.367221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.367239 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.367748 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.380333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c346ae47-7294-4960-b4f3-9d791c931a12-lock\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.381095 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c346ae47-7294-4960-b4f3-9d791c931a12-cache\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: E1205 08:42:38.381202 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 08:42:38 crc kubenswrapper[4795]: E1205 08:42:38.381225 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 08:42:38 crc kubenswrapper[4795]: E1205 08:42:38.381281 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift podName:c346ae47-7294-4960-b4f3-9d791c931a12 nodeName:}" failed. No retries permitted until 2025-12-05 08:42:38.881259359 +0000 UTC m=+1110.453863088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift") pod "swift-storage-0" (UID: "c346ae47-7294-4960-b4f3-9d791c931a12") : configmap "swift-ring-files" not found Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.403738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43329d03-0f41-4c52-ab0f-c3973d35e536-kube-api-access-k6m5j" (OuterVolumeSpecName: "kube-api-access-k6m5j") pod "43329d03-0f41-4c52-ab0f-c3973d35e536" (UID: "43329d03-0f41-4c52-ab0f-c3973d35e536"). InnerVolumeSpecName "kube-api-access-k6m5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.424089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssg58\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-kube-api-access-ssg58\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.445596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.470243 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6m5j\" (UniqueName: \"kubernetes.io/projected/43329d03-0f41-4c52-ab0f-c3973d35e536-kube-api-access-k6m5j\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.481859 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43329d03-0f41-4c52-ab0f-c3973d35e536" (UID: "43329d03-0f41-4c52-ab0f-c3973d35e536"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.506816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43329d03-0f41-4c52-ab0f-c3973d35e536" (UID: "43329d03-0f41-4c52-ab0f-c3973d35e536"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.514284 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-config" (OuterVolumeSpecName: "config") pod "43329d03-0f41-4c52-ab0f-c3973d35e536" (UID: "43329d03-0f41-4c52-ab0f-c3973d35e536"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.555770 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jpnk4"] Dec 05 08:42:38 crc kubenswrapper[4795]: W1205 08:42:38.560463 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa752e5e_08d3_49db_b564_8efe2d39b1ca.slice/crio-85514fb445dc69d5b7419133eac3da44fd2d15eec44d8dd97dd714ba758479f6 WatchSource:0}: Error finding container 85514fb445dc69d5b7419133eac3da44fd2d15eec44d8dd97dd714ba758479f6: Status 404 returned error can't find the container with id 85514fb445dc69d5b7419133eac3da44fd2d15eec44d8dd97dd714ba758479f6 Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.573841 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.574079 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.574167 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43329d03-0f41-4c52-ab0f-c3973d35e536-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 
08:42:38.834629 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" event={"ID":"fa752e5e-08d3-49db-b564-8efe2d39b1ca","Type":"ContainerStarted","Data":"d9d2ec164824abd94a54796a572c3d7786ff94286bd52cd8b9aab2d31447ff8b"} Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.834725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" event={"ID":"fa752e5e-08d3-49db-b564-8efe2d39b1ca","Type":"ContainerStarted","Data":"85514fb445dc69d5b7419133eac3da44fd2d15eec44d8dd97dd714ba758479f6"} Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.837561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6d7fe8fd-0377-462d-b17e-1ec92a8d0464","Type":"ContainerStarted","Data":"e41667bcdf578c99b876a668a78b223abd0656093ff226bdad5e824c437880d3"} Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.837590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6d7fe8fd-0377-462d-b17e-1ec92a8d0464","Type":"ContainerStarted","Data":"cc2872c448ee8e3102b07c0675f1a7aca68ec4efb49f7b1d73c97cfcb6344121"} Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.839714 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.839739 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-nk88t" event={"ID":"43329d03-0f41-4c52-ab0f-c3973d35e536","Type":"ContainerDied","Data":"0469c640dde0436c5e3f324141ddce77eace12a5aadc481e439ca58ae1658602"} Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.839828 4795 scope.go:117] "RemoveContainer" containerID="1e90f80964e027589bb397c0aba92a9326a71eca01bfd5a8a3ac60468729c0bf" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.883326 4795 scope.go:117] "RemoveContainer" containerID="adc1dbac879a537e05f28d33ce85fb631fd397bdbc9e3bbc39a9cf80073e3b58" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.884419 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:38 crc kubenswrapper[4795]: E1205 08:42:38.884749 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 08:42:38 crc kubenswrapper[4795]: E1205 08:42:38.884780 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 08:42:38 crc kubenswrapper[4795]: E1205 08:42:38.884840 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift podName:c346ae47-7294-4960-b4f3-9d791c931a12 nodeName:}" failed. No retries permitted until 2025-12-05 08:42:39.8848156 +0000 UTC m=+1111.457419339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift") pod "swift-storage-0" (UID: "c346ae47-7294-4960-b4f3-9d791c931a12") : configmap "swift-ring-files" not found Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.914415 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.713698253 podStartE2EDuration="3.9143758s" podCreationTimestamp="2025-12-05 08:42:35 +0000 UTC" firstStartedPulling="2025-12-05 08:42:35.86104934 +0000 UTC m=+1107.433653079" lastFinishedPulling="2025-12-05 08:42:38.061726887 +0000 UTC m=+1109.634330626" observedRunningTime="2025-12-05 08:42:38.895472954 +0000 UTC m=+1110.468076703" watchObservedRunningTime="2025-12-05 08:42:38.9143758 +0000 UTC m=+1110.486979539" Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.943135 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nk88t"] Dec 05 08:42:38 crc kubenswrapper[4795]: I1205 08:42:38.953026 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-nk88t"] Dec 05 08:42:39 crc kubenswrapper[4795]: I1205 08:42:39.854817 4795 generic.go:334] "Generic (PLEG): container finished" podID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" containerID="d9d2ec164824abd94a54796a572c3d7786ff94286bd52cd8b9aab2d31447ff8b" exitCode=0 Dec 05 08:42:39 crc kubenswrapper[4795]: I1205 08:42:39.854903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" event={"ID":"fa752e5e-08d3-49db-b564-8efe2d39b1ca","Type":"ContainerDied","Data":"d9d2ec164824abd94a54796a572c3d7786ff94286bd52cd8b9aab2d31447ff8b"} Dec 05 08:42:39 crc kubenswrapper[4795]: I1205 08:42:39.859772 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 05 08:42:39 crc kubenswrapper[4795]: I1205 08:42:39.906122 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:39 crc kubenswrapper[4795]: E1205 08:42:39.906474 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 08:42:39 crc kubenswrapper[4795]: E1205 08:42:39.906495 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 08:42:39 crc kubenswrapper[4795]: E1205 08:42:39.906558 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift podName:c346ae47-7294-4960-b4f3-9d791c931a12 nodeName:}" failed. No retries permitted until 2025-12-05 08:42:41.906529778 +0000 UTC m=+1113.479133517 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift") pod "swift-storage-0" (UID: "c346ae47-7294-4960-b4f3-9d791c931a12") : configmap "swift-ring-files" not found Dec 05 08:42:40 crc kubenswrapper[4795]: I1205 08:42:40.760156 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43329d03-0f41-4c52-ab0f-c3973d35e536" path="/var/lib/kubelet/pods/43329d03-0f41-4c52-ab0f-c3973d35e536/volumes" Dec 05 08:42:40 crc kubenswrapper[4795]: I1205 08:42:40.860807 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-8df4h" Dec 05 08:42:40 crc kubenswrapper[4795]: I1205 08:42:40.873394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" event={"ID":"fa752e5e-08d3-49db-b564-8efe2d39b1ca","Type":"ContainerStarted","Data":"a9658456120763d7c91d99b658a97fd6695c0e422123fc69bcbfef0bc36839c5"} Dec 05 08:42:40 crc kubenswrapper[4795]: I1205 08:42:40.873878 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:42:40 crc kubenswrapper[4795]: I1205 08:42:40.925270 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" podStartSLOduration=4.925241189 podStartE2EDuration="4.925241189s" podCreationTimestamp="2025-12-05 08:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:42:40.924996733 +0000 UTC m=+1112.497600472" watchObservedRunningTime="2025-12-05 08:42:40.925241189 +0000 UTC m=+1112.497844938" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.882062 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m46d5"] Dec 05 08:42:41 crc kubenswrapper[4795]: E1205 08:42:41.882645 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="43329d03-0f41-4c52-ab0f-c3973d35e536" containerName="dnsmasq-dns" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.882665 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="43329d03-0f41-4c52-ab0f-c3973d35e536" containerName="dnsmasq-dns" Dec 05 08:42:41 crc kubenswrapper[4795]: E1205 08:42:41.882694 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43329d03-0f41-4c52-ab0f-c3973d35e536" containerName="init" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.882704 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="43329d03-0f41-4c52-ab0f-c3973d35e536" containerName="init" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.882943 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="43329d03-0f41-4c52-ab0f-c3973d35e536" containerName="dnsmasq-dns" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.883772 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.895571 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.895748 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.895763 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.902956 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m46d5"] Dec 05 08:42:41 crc kubenswrapper[4795]: I1205 08:42:41.949297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: 
\"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:41 crc kubenswrapper[4795]: E1205 08:42:41.949622 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 08:42:41 crc kubenswrapper[4795]: E1205 08:42:41.949669 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 08:42:41 crc kubenswrapper[4795]: E1205 08:42:41.949739 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift podName:c346ae47-7294-4960-b4f3-9d791c931a12 nodeName:}" failed. No retries permitted until 2025-12-05 08:42:45.949712857 +0000 UTC m=+1117.522316596 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift") pod "swift-storage-0" (UID: "c346ae47-7294-4960-b4f3-9d791c931a12") : configmap "swift-ring-files" not found Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.051731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-scripts\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.052399 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-swiftconf\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.052543 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-ring-data-devices\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.052585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rk97\" (UniqueName: \"kubernetes.io/projected/7d94f47e-cb5c-427e-b529-dee69261109f-kube-api-access-2rk97\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.052636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-combined-ca-bundle\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.052704 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-dispersionconf\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.052738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d94f47e-cb5c-427e-b529-dee69261109f-etc-swift\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 
08:42:42.155770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-ring-data-devices\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.155894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rk97\" (UniqueName: \"kubernetes.io/projected/7d94f47e-cb5c-427e-b529-dee69261109f-kube-api-access-2rk97\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.155925 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-combined-ca-bundle\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.155958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-dispersionconf\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.156120 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d94f47e-cb5c-427e-b529-dee69261109f-etc-swift\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.156157 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-scripts\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.156258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-swiftconf\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.162419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-ring-data-devices\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.162925 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d94f47e-cb5c-427e-b529-dee69261109f-etc-swift\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.166210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-scripts\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.166646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-swiftconf\") 
pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.176343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-dispersionconf\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.180414 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-combined-ca-bundle\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.185812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rk97\" (UniqueName: \"kubernetes.io/projected/7d94f47e-cb5c-427e-b529-dee69261109f-kube-api-access-2rk97\") pod \"swift-ring-rebalance-m46d5\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.207327 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.465153 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m46d5"] Dec 05 08:42:42 crc kubenswrapper[4795]: W1205 08:42:42.472453 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d94f47e_cb5c_427e_b529_dee69261109f.slice/crio-da26000c042c43965bcc9f51b3265e4c28a4357f983b7ee8f974613ad74c1d87 WatchSource:0}: Error finding container da26000c042c43965bcc9f51b3265e4c28a4357f983b7ee8f974613ad74c1d87: Status 404 returned error can't find the container with id da26000c042c43965bcc9f51b3265e4c28a4357f983b7ee8f974613ad74c1d87 Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.763254 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.763533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.844857 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.892269 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m46d5" event={"ID":"7d94f47e-cb5c-427e-b529-dee69261109f","Type":"ContainerStarted","Data":"da26000c042c43965bcc9f51b3265e4c28a4357f983b7ee8f974613ad74c1d87"} Dec 05 08:42:42 crc kubenswrapper[4795]: I1205 08:42:42.969522 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 05 08:42:43 crc kubenswrapper[4795]: I1205 08:42:43.903681 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 05 08:42:43 crc kubenswrapper[4795]: I1205 
08:42:43.904328 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.007538 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.291785 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-khgjk"] Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.293097 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.315605 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-khgjk"] Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.409189 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwjd2\" (UniqueName: \"kubernetes.io/projected/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-kube-api-access-hwjd2\") pod \"keystone-db-create-khgjk\" (UID: \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\") " pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.409507 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-operator-scripts\") pod \"keystone-db-create-khgjk\" (UID: \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\") " pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.433698 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b449-account-create-update-8qjgb"] Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.443931 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b449-account-create-update-8qjgb" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.446981 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.458738 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b449-account-create-update-8qjgb"] Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.512102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-operator-scripts\") pod \"keystone-b449-account-create-update-8qjgb\" (UID: \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\") " pod="openstack/keystone-b449-account-create-update-8qjgb" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.512167 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w78zs\" (UniqueName: \"kubernetes.io/projected/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-kube-api-access-w78zs\") pod \"keystone-b449-account-create-update-8qjgb\" (UID: \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\") " pod="openstack/keystone-b449-account-create-update-8qjgb" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.512510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-operator-scripts\") pod \"keystone-db-create-khgjk\" (UID: \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\") " pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.512805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwjd2\" (UniqueName: \"kubernetes.io/projected/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-kube-api-access-hwjd2\") pod \"keystone-db-create-khgjk\" 
(UID: \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\") " pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.513898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-operator-scripts\") pod \"keystone-db-create-khgjk\" (UID: \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\") " pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.541627 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwjd2\" (UniqueName: \"kubernetes.io/projected/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-kube-api-access-hwjd2\") pod \"keystone-db-create-khgjk\" (UID: \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\") " pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.574800 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-9lvnd"] Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.577374 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-9lvnd" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.593944 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9lvnd"] Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.614773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w78zs\" (UniqueName: \"kubernetes.io/projected/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-kube-api-access-w78zs\") pod \"keystone-b449-account-create-update-8qjgb\" (UID: \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\") " pod="openstack/keystone-b449-account-create-update-8qjgb" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.614965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-operator-scripts\") pod \"keystone-b449-account-create-update-8qjgb\" (UID: \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\") " pod="openstack/keystone-b449-account-create-update-8qjgb" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.615796 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-operator-scripts\") pod \"keystone-b449-account-create-update-8qjgb\" (UID: \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\") " pod="openstack/keystone-b449-account-create-update-8qjgb" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.627936 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.651332 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5042-account-create-update-zvtfw"] Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.652240 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w78zs\" (UniqueName: \"kubernetes.io/projected/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-kube-api-access-w78zs\") pod \"keystone-b449-account-create-update-8qjgb\" (UID: \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\") " pod="openstack/keystone-b449-account-create-update-8qjgb" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.652920 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5042-account-create-update-zvtfw" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.656869 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.657064 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5042-account-create-update-zvtfw"] Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.718866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2kxw\" (UniqueName: \"kubernetes.io/projected/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-kube-api-access-w2kxw\") pod \"placement-db-create-9lvnd\" (UID: \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\") " pod="openstack/placement-db-create-9lvnd" Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.719051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfsp5\" (UniqueName: \"kubernetes.io/projected/32b0b091-d9e3-4203-a73f-bd38fe4105f8-kube-api-access-wfsp5\") pod \"placement-5042-account-create-update-zvtfw\" (UID: 
\"32b0b091-d9e3-4203-a73f-bd38fe4105f8\") " pod="openstack/placement-5042-account-create-update-zvtfw"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.719118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b0b091-d9e3-4203-a73f-bd38fe4105f8-operator-scripts\") pod \"placement-5042-account-create-update-zvtfw\" (UID: \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\") " pod="openstack/placement-5042-account-create-update-zvtfw"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.719162 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-operator-scripts\") pod \"placement-db-create-9lvnd\" (UID: \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\") " pod="openstack/placement-db-create-9lvnd"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.780123 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b449-account-create-update-8qjgb"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.832840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfsp5\" (UniqueName: \"kubernetes.io/projected/32b0b091-d9e3-4203-a73f-bd38fe4105f8-kube-api-access-wfsp5\") pod \"placement-5042-account-create-update-zvtfw\" (UID: \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\") " pod="openstack/placement-5042-account-create-update-zvtfw"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.832985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b0b091-d9e3-4203-a73f-bd38fe4105f8-operator-scripts\") pod \"placement-5042-account-create-update-zvtfw\" (UID: \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\") " pod="openstack/placement-5042-account-create-update-zvtfw"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.833069 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-operator-scripts\") pod \"placement-db-create-9lvnd\" (UID: \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\") " pod="openstack/placement-db-create-9lvnd"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.833503 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kxw\" (UniqueName: \"kubernetes.io/projected/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-kube-api-access-w2kxw\") pod \"placement-db-create-9lvnd\" (UID: \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\") " pod="openstack/placement-db-create-9lvnd"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.835508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b0b091-d9e3-4203-a73f-bd38fe4105f8-operator-scripts\") pod \"placement-5042-account-create-update-zvtfw\" (UID: \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\") " pod="openstack/placement-5042-account-create-update-zvtfw"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.835770 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-operator-scripts\") pod \"placement-db-create-9lvnd\" (UID: \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\") " pod="openstack/placement-db-create-9lvnd"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.853704 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-6qbk2"]
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.864574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2kxw\" (UniqueName: \"kubernetes.io/projected/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-kube-api-access-w2kxw\") pod \"placement-db-create-9lvnd\" (UID: \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\") " pod="openstack/placement-db-create-9lvnd"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.875522 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.889491 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfsp5\" (UniqueName: \"kubernetes.io/projected/32b0b091-d9e3-4203-a73f-bd38fe4105f8-kube-api-access-wfsp5\") pod \"placement-5042-account-create-update-zvtfw\" (UID: \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\") " pod="openstack/placement-5042-account-create-update-zvtfw"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.921398 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6qbk2"]
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.931458 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9lvnd"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.937116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b485181e-1817-4475-82e6-fa5705a822c1-operator-scripts\") pod \"glance-db-create-6qbk2\" (UID: \"b485181e-1817-4475-82e6-fa5705a822c1\") " pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:44 crc kubenswrapper[4795]: I1205 08:42:44.937188 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxjwn\" (UniqueName: \"kubernetes.io/projected/b485181e-1817-4475-82e6-fa5705a822c1-kube-api-access-nxjwn\") pod \"glance-db-create-6qbk2\" (UID: \"b485181e-1817-4475-82e6-fa5705a822c1\") " pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.029794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5042-account-create-update-zvtfw"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.041799 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b485181e-1817-4475-82e6-fa5705a822c1-operator-scripts\") pod \"glance-db-create-6qbk2\" (UID: \"b485181e-1817-4475-82e6-fa5705a822c1\") " pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.041863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxjwn\" (UniqueName: \"kubernetes.io/projected/b485181e-1817-4475-82e6-fa5705a822c1-kube-api-access-nxjwn\") pod \"glance-db-create-6qbk2\" (UID: \"b485181e-1817-4475-82e6-fa5705a822c1\") " pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.048596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b485181e-1817-4475-82e6-fa5705a822c1-operator-scripts\") pod \"glance-db-create-6qbk2\" (UID: \"b485181e-1817-4475-82e6-fa5705a822c1\") " pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.066363 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-71b3-account-create-update-hwtdv"]
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.067816 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.083448 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxjwn\" (UniqueName: \"kubernetes.io/projected/b485181e-1817-4475-82e6-fa5705a822c1-kube-api-access-nxjwn\") pod \"glance-db-create-6qbk2\" (UID: \"b485181e-1817-4475-82e6-fa5705a822c1\") " pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.088337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-71b3-account-create-update-hwtdv"]
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.089154 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.143833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-operator-scripts\") pod \"glance-71b3-account-create-update-hwtdv\" (UID: \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\") " pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.143952 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2975\" (UniqueName: \"kubernetes.io/projected/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-kube-api-access-q2975\") pod \"glance-71b3-account-create-update-hwtdv\" (UID: \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\") " pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.153250 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.249993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-operator-scripts\") pod \"glance-71b3-account-create-update-hwtdv\" (UID: \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\") " pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.250132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2975\" (UniqueName: \"kubernetes.io/projected/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-kube-api-access-q2975\") pod \"glance-71b3-account-create-update-hwtdv\" (UID: \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\") " pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.251023 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-operator-scripts\") pod \"glance-71b3-account-create-update-hwtdv\" (UID: \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\") " pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.256858 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.281296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2975\" (UniqueName: \"kubernetes.io/projected/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-kube-api-access-q2975\") pod \"glance-71b3-account-create-update-hwtdv\" (UID: \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\") " pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.445899 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:45 crc kubenswrapper[4795]: I1205 08:42:45.967362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0"
Dec 05 08:42:45 crc kubenswrapper[4795]: E1205 08:42:45.967938 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 05 08:42:45 crc kubenswrapper[4795]: E1205 08:42:45.968204 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 05 08:42:45 crc kubenswrapper[4795]: E1205 08:42:45.968268 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift podName:c346ae47-7294-4960-b4f3-9d791c931a12 nodeName:}" failed. No retries permitted until 2025-12-05 08:42:53.968246563 +0000 UTC m=+1125.540850302 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift") pod "swift-storage-0" (UID: "c346ae47-7294-4960-b4f3-9d791c931a12") : configmap "swift-ring-files" not found
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.226874 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4"
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.267733 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-71b3-account-create-update-hwtdv"]
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.327240 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8df4h"]
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.327697 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-8df4h" podUID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" containerName="dnsmasq-dns" containerID="cri-o://851941a42cf2fd0a1266f20756cb4f5c90156ec2f0adc75d9897a69d9fa0be7a" gracePeriod=10
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.372725 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-6qbk2"]
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.593334 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.713257 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-9lvnd"]
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.738001 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5042-account-create-update-zvtfw"]
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.750597 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b449-account-create-update-8qjgb"]
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.782400 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-khgjk"]
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.961481 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-khgjk" event={"ID":"23b22dd0-9f53-4a9f-a431-3de3d43d1e14","Type":"ContainerStarted","Data":"1e1ee47d9cdfd184910e5eabcfdf9e608e6e38632bc3e101da6eb976854c0ad2"}
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.967939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b449-account-create-update-8qjgb" event={"ID":"c71c1756-d33b-48eb-b3e5-9298a6ba19e0","Type":"ContainerStarted","Data":"68b0c89d77a501819b01324b32ada2a2970966386b7bae345458425b49b6e5e2"}
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.973782 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6qbk2" event={"ID":"b485181e-1817-4475-82e6-fa5705a822c1","Type":"ContainerStarted","Data":"8e3e086e1c10822b1d8c720927f9dc67dc6f78d281cd7c1f1f5ac8b9251a7215"}
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.976505 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5042-account-create-update-zvtfw" event={"ID":"32b0b091-d9e3-4203-a73f-bd38fe4105f8","Type":"ContainerStarted","Data":"3b2a8dab416ae67753317a637b9aebdced852fa401526f2f1eb9a9bfc4b54bb4"}
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.978436 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9lvnd" event={"ID":"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b","Type":"ContainerStarted","Data":"6732e8c9bdc0197d6b6d93a783ff6e0eb5cd804f5badcd0625f4fbdc92a0b859"}
Dec 05 08:42:47 crc kubenswrapper[4795]: I1205 08:42:47.984628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-71b3-account-create-update-hwtdv" event={"ID":"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07","Type":"ContainerStarted","Data":"e57c74b7c1279b322cd554cf0019767ee6a6a162f1150fa04ce3f461cb631ef8"}
Dec 05 08:42:48 crc kubenswrapper[4795]: I1205 08:42:48.998414 4795 generic.go:334] "Generic (PLEG): container finished" podID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" containerID="851941a42cf2fd0a1266f20756cb4f5c90156ec2f0adc75d9897a69d9fa0be7a" exitCode=0
Dec 05 08:42:48 crc kubenswrapper[4795]: I1205 08:42:48.998576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8df4h" event={"ID":"ac0ebab7-6587-452e-b3ed-7c6d208c637c","Type":"ContainerDied","Data":"851941a42cf2fd0a1266f20756cb4f5c90156ec2f0adc75d9897a69d9fa0be7a"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.010282 4795 generic.go:334] "Generic (PLEG): container finished" podID="32b0b091-d9e3-4203-a73f-bd38fe4105f8" containerID="b836be2049535bb20686372de242ae824d9b525c2d02bd33b6c0105a402e063f" exitCode=0
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.010374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5042-account-create-update-zvtfw" event={"ID":"32b0b091-d9e3-4203-a73f-bd38fe4105f8","Type":"ContainerDied","Data":"b836be2049535bb20686372de242ae824d9b525c2d02bd33b6c0105a402e063f"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.013999 4795 generic.go:334] "Generic (PLEG): container finished" podID="bf3d3a00-5afb-42a5-921c-e7e77afe8a7b" containerID="8eaa2126e49c8d58b1b9e86420333c7fe7f751a020a577786bbb5d8cb7ed9055" exitCode=0
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.014192 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9lvnd" event={"ID":"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b","Type":"ContainerDied","Data":"8eaa2126e49c8d58b1b9e86420333c7fe7f751a020a577786bbb5d8cb7ed9055"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.017419 4795 generic.go:334] "Generic (PLEG): container finished" podID="bd94cdc2-bb5f-423f-b2d6-5e83c050cd07" containerID="24a16bab7e899618ec2f74f7091be078680e9df1923def1e5f1eaca842e82c1d" exitCode=0
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.017524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-71b3-account-create-update-hwtdv" event={"ID":"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07","Type":"ContainerDied","Data":"24a16bab7e899618ec2f74f7091be078680e9df1923def1e5f1eaca842e82c1d"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.019032 4795 generic.go:334] "Generic (PLEG): container finished" podID="23b22dd0-9f53-4a9f-a431-3de3d43d1e14" containerID="fa2700636dfb09e55986f1c309d9767ed73b7b1452490411afe1d84e92a716e8" exitCode=0
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.019105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-khgjk" event={"ID":"23b22dd0-9f53-4a9f-a431-3de3d43d1e14","Type":"ContainerDied","Data":"fa2700636dfb09e55986f1c309d9767ed73b7b1452490411afe1d84e92a716e8"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.021234 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8df4h" event={"ID":"ac0ebab7-6587-452e-b3ed-7c6d208c637c","Type":"ContainerDied","Data":"c2d24ff8cffe8fedea6e0dd0785072cc78796a78e4f03ec06deca4d8a3980f2e"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.021274 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2d24ff8cffe8fedea6e0dd0785072cc78796a78e4f03ec06deca4d8a3980f2e"
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.024861 4795 generic.go:334] "Generic (PLEG): container finished" podID="c71c1756-d33b-48eb-b3e5-9298a6ba19e0" containerID="6cdf1e0af36fd9920382b1be51940ff8373451a22a65edd9d861f3c8be4638ae" exitCode=0
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.024902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b449-account-create-update-8qjgb" event={"ID":"c71c1756-d33b-48eb-b3e5-9298a6ba19e0","Type":"ContainerDied","Data":"6cdf1e0af36fd9920382b1be51940ff8373451a22a65edd9d861f3c8be4638ae"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.030426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m46d5" event={"ID":"7d94f47e-cb5c-427e-b529-dee69261109f","Type":"ContainerStarted","Data":"30eb6bd0ec95f954e4988e2c1e5e4fac4f98ade8e0e3b252db7e3206c24d9263"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.034198 4795 generic.go:334] "Generic (PLEG): container finished" podID="b485181e-1817-4475-82e6-fa5705a822c1" containerID="4fd2d8dd61ec680a52d697af2f536aad4617a91d89ae6455fda03b8a472bfc6b" exitCode=0
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.034257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6qbk2" event={"ID":"b485181e-1817-4475-82e6-fa5705a822c1","Type":"ContainerDied","Data":"4fd2d8dd61ec680a52d697af2f536aad4617a91d89ae6455fda03b8a472bfc6b"}
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.058816 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-m46d5" podStartSLOduration=4.916033902 podStartE2EDuration="9.058769449s" podCreationTimestamp="2025-12-05 08:42:41 +0000 UTC" firstStartedPulling="2025-12-05 08:42:42.478087557 +0000 UTC m=+1114.050691296" lastFinishedPulling="2025-12-05 08:42:46.620823104 +0000 UTC m=+1118.193426843" observedRunningTime="2025-12-05 08:42:50.055854734 +0000 UTC m=+1121.628458483" watchObservedRunningTime="2025-12-05 08:42:50.058769449 +0000 UTC m=+1121.631373178"
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.163300 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8df4h"
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.286977 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-config\") pod \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") "
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.287190 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-sb\") pod \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") "
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.288301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-nb\") pod \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") "
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.288544 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nxfn\" (UniqueName: \"kubernetes.io/projected/ac0ebab7-6587-452e-b3ed-7c6d208c637c-kube-api-access-9nxfn\") pod \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") "
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.288598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-dns-svc\") pod \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\" (UID: \"ac0ebab7-6587-452e-b3ed-7c6d208c637c\") "
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.296848 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0ebab7-6587-452e-b3ed-7c6d208c637c-kube-api-access-9nxfn" (OuterVolumeSpecName: "kube-api-access-9nxfn") pod "ac0ebab7-6587-452e-b3ed-7c6d208c637c" (UID: "ac0ebab7-6587-452e-b3ed-7c6d208c637c"). InnerVolumeSpecName "kube-api-access-9nxfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.343684 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac0ebab7-6587-452e-b3ed-7c6d208c637c" (UID: "ac0ebab7-6587-452e-b3ed-7c6d208c637c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.346871 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-config" (OuterVolumeSpecName: "config") pod "ac0ebab7-6587-452e-b3ed-7c6d208c637c" (UID: "ac0ebab7-6587-452e-b3ed-7c6d208c637c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.361304 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac0ebab7-6587-452e-b3ed-7c6d208c637c" (UID: "ac0ebab7-6587-452e-b3ed-7c6d208c637c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.365938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac0ebab7-6587-452e-b3ed-7c6d208c637c" (UID: "ac0ebab7-6587-452e-b3ed-7c6d208c637c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.390700 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.390757 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.390768 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nxfn\" (UniqueName: \"kubernetes.io/projected/ac0ebab7-6587-452e-b3ed-7c6d208c637c-kube-api-access-9nxfn\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.390778 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.390805 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac0ebab7-6587-452e-b3ed-7c6d208c637c-config\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:50 crc kubenswrapper[4795]: I1205 08:42:50.438816 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.041845 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8df4h"
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.086642 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8df4h"]
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.097192 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8df4h"]
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.510633 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5042-account-create-update-zvtfw"
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.623094 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b0b091-d9e3-4203-a73f-bd38fe4105f8-operator-scripts\") pod \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\" (UID: \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.623594 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfsp5\" (UniqueName: \"kubernetes.io/projected/32b0b091-d9e3-4203-a73f-bd38fe4105f8-kube-api-access-wfsp5\") pod \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\" (UID: \"32b0b091-d9e3-4203-a73f-bd38fe4105f8\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.625131 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b0b091-d9e3-4203-a73f-bd38fe4105f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32b0b091-d9e3-4203-a73f-bd38fe4105f8" (UID: "32b0b091-d9e3-4203-a73f-bd38fe4105f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.637903 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b0b091-d9e3-4203-a73f-bd38fe4105f8-kube-api-access-wfsp5" (OuterVolumeSpecName: "kube-api-access-wfsp5") pod "32b0b091-d9e3-4203-a73f-bd38fe4105f8" (UID: "32b0b091-d9e3-4203-a73f-bd38fe4105f8"). InnerVolumeSpecName "kube-api-access-wfsp5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.730886 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b0b091-d9e3-4203-a73f-bd38fe4105f8-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.731227 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfsp5\" (UniqueName: \"kubernetes.io/projected/32b0b091-d9e3-4203-a73f-bd38fe4105f8-kube-api-access-wfsp5\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.780908 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-khgjk"
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.819180 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-6qbk2"
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.831709 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-operator-scripts\") pod \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\" (UID: \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.831765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b485181e-1817-4475-82e6-fa5705a822c1-operator-scripts\") pod \"b485181e-1817-4475-82e6-fa5705a822c1\" (UID: \"b485181e-1817-4475-82e6-fa5705a822c1\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.831828 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxjwn\" (UniqueName: \"kubernetes.io/projected/b485181e-1817-4475-82e6-fa5705a822c1-kube-api-access-nxjwn\") pod \"b485181e-1817-4475-82e6-fa5705a822c1\" (UID: \"b485181e-1817-4475-82e6-fa5705a822c1\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.832048 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwjd2\" (UniqueName: \"kubernetes.io/projected/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-kube-api-access-hwjd2\") pod \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\" (UID: \"23b22dd0-9f53-4a9f-a431-3de3d43d1e14\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.832387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23b22dd0-9f53-4a9f-a431-3de3d43d1e14" (UID: "23b22dd0-9f53-4a9f-a431-3de3d43d1e14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.832481 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.835887 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b485181e-1817-4475-82e6-fa5705a822c1-kube-api-access-nxjwn" (OuterVolumeSpecName: "kube-api-access-nxjwn") pod "b485181e-1817-4475-82e6-fa5705a822c1" (UID: "b485181e-1817-4475-82e6-fa5705a822c1"). InnerVolumeSpecName "kube-api-access-nxjwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.837590 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b485181e-1817-4475-82e6-fa5705a822c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b485181e-1817-4475-82e6-fa5705a822c1" (UID: "b485181e-1817-4475-82e6-fa5705a822c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.842499 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b449-account-create-update-8qjgb"
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.847181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-kube-api-access-hwjd2" (OuterVolumeSpecName: "kube-api-access-hwjd2") pod "23b22dd0-9f53-4a9f-a431-3de3d43d1e14" (UID: "23b22dd0-9f53-4a9f-a431-3de3d43d1e14"). InnerVolumeSpecName "kube-api-access-hwjd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.847339 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-71b3-account-create-update-hwtdv"
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.890303 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-9lvnd"
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.941939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2975\" (UniqueName: \"kubernetes.io/projected/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-kube-api-access-q2975\") pod \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\" (UID: \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.942016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-operator-scripts\") pod \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\" (UID: \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.942061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2kxw\" (UniqueName: \"kubernetes.io/projected/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-kube-api-access-w2kxw\") pod \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\" (UID: \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.942110 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-operator-scripts\") pod \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\" (UID: \"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.942163 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w78zs\" (UniqueName: \"kubernetes.io/projected/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-kube-api-access-w78zs\") pod \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\" (UID: \"c71c1756-d33b-48eb-b3e5-9298a6ba19e0\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.942210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-operator-scripts\") pod \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\" (UID: \"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b\") "
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.943447 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf3d3a00-5afb-42a5-921c-e7e77afe8a7b" (UID: "bf3d3a00-5afb-42a5-921c-e7e77afe8a7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.943956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd94cdc2-bb5f-423f-b2d6-5e83c050cd07" (UID: "bd94cdc2-bb5f-423f-b2d6-5e83c050cd07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.944097 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b485181e-1817-4475-82e6-fa5705a822c1-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.944431 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxjwn\" (UniqueName: \"kubernetes.io/projected/b485181e-1817-4475-82e6-fa5705a822c1-kube-api-access-nxjwn\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.944452 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwjd2\" (UniqueName: \"kubernetes.io/projected/23b22dd0-9f53-4a9f-a431-3de3d43d1e14-kube-api-access-hwjd2\") on node \"crc\" DevicePath \"\""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.945030 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c71c1756-d33b-48eb-b3e5-9298a6ba19e0" (UID: "c71c1756-d33b-48eb-b3e5-9298a6ba19e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.948005 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-kube-api-access-w78zs" (OuterVolumeSpecName: "kube-api-access-w78zs") pod "c71c1756-d33b-48eb-b3e5-9298a6ba19e0" (UID: "c71c1756-d33b-48eb-b3e5-9298a6ba19e0"). InnerVolumeSpecName "kube-api-access-w78zs".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.948316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-kube-api-access-q2975" (OuterVolumeSpecName: "kube-api-access-q2975") pod "bd94cdc2-bb5f-423f-b2d6-5e83c050cd07" (UID: "bd94cdc2-bb5f-423f-b2d6-5e83c050cd07"). InnerVolumeSpecName "kube-api-access-q2975". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:51 crc kubenswrapper[4795]: I1205 08:42:51.952325 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-kube-api-access-w2kxw" (OuterVolumeSpecName: "kube-api-access-w2kxw") pod "bf3d3a00-5afb-42a5-921c-e7e77afe8a7b" (UID: "bf3d3a00-5afb-42a5-921c-e7e77afe8a7b"). InnerVolumeSpecName "kube-api-access-w2kxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.047125 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w78zs\" (UniqueName: \"kubernetes.io/projected/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-kube-api-access-w78zs\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.047200 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.047213 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2975\" (UniqueName: \"kubernetes.io/projected/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-kube-api-access-q2975\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.047223 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c71c1756-d33b-48eb-b3e5-9298a6ba19e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.047233 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2kxw\" (UniqueName: \"kubernetes.io/projected/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b-kube-api-access-w2kxw\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.047244 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.070107 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-khgjk" event={"ID":"23b22dd0-9f53-4a9f-a431-3de3d43d1e14","Type":"ContainerDied","Data":"1e1ee47d9cdfd184910e5eabcfdf9e608e6e38632bc3e101da6eb976854c0ad2"} Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.070136 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-khgjk" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.070189 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e1ee47d9cdfd184910e5eabcfdf9e608e6e38632bc3e101da6eb976854c0ad2" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.074265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b449-account-create-update-8qjgb" event={"ID":"c71c1756-d33b-48eb-b3e5-9298a6ba19e0","Type":"ContainerDied","Data":"68b0c89d77a501819b01324b32ada2a2970966386b7bae345458425b49b6e5e2"} Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.074295 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b0c89d77a501819b01324b32ada2a2970966386b7bae345458425b49b6e5e2" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.074317 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b449-account-create-update-8qjgb" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.077098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-6qbk2" event={"ID":"b485181e-1817-4475-82e6-fa5705a822c1","Type":"ContainerDied","Data":"8e3e086e1c10822b1d8c720927f9dc67dc6f78d281cd7c1f1f5ac8b9251a7215"} Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.077122 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e3e086e1c10822b1d8c720927f9dc67dc6f78d281cd7c1f1f5ac8b9251a7215" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.077219 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-6qbk2" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.080788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5042-account-create-update-zvtfw" event={"ID":"32b0b091-d9e3-4203-a73f-bd38fe4105f8","Type":"ContainerDied","Data":"3b2a8dab416ae67753317a637b9aebdced852fa401526f2f1eb9a9bfc4b54bb4"} Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.080919 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b2a8dab416ae67753317a637b9aebdced852fa401526f2f1eb9a9bfc4b54bb4" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.080806 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5042-account-create-update-zvtfw" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.082636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-9lvnd" event={"ID":"bf3d3a00-5afb-42a5-921c-e7e77afe8a7b","Type":"ContainerDied","Data":"6732e8c9bdc0197d6b6d93a783ff6e0eb5cd804f5badcd0625f4fbdc92a0b859"} Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.082735 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6732e8c9bdc0197d6b6d93a783ff6e0eb5cd804f5badcd0625f4fbdc92a0b859" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.082691 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-9lvnd" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.084871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-71b3-account-create-update-hwtdv" event={"ID":"bd94cdc2-bb5f-423f-b2d6-5e83c050cd07","Type":"ContainerDied","Data":"e57c74b7c1279b322cd554cf0019767ee6a6a162f1150fa04ce3f461cb631ef8"} Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.084923 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e57c74b7c1279b322cd554cf0019767ee6a6a162f1150fa04ce3f461cb631ef8" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.085046 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-71b3-account-create-update-hwtdv" Dec 05 08:42:52 crc kubenswrapper[4795]: I1205 08:42:52.767012 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" path="/var/lib/kubelet/pods/ac0ebab7-6587-452e-b3ed-7c6d208c637c/volumes" Dec 05 08:42:53 crc kubenswrapper[4795]: I1205 08:42:53.980000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:42:53 crc kubenswrapper[4795]: E1205 08:42:53.980264 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 05 08:42:53 crc kubenswrapper[4795]: E1205 08:42:53.982476 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 05 08:42:53 crc kubenswrapper[4795]: E1205 08:42:53.982577 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift 
podName:c346ae47-7294-4960-b4f3-9d791c931a12 nodeName:}" failed. No retries permitted until 2025-12-05 08:43:09.98254573 +0000 UTC m=+1141.555149489 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift") pod "swift-storage-0" (UID: "c346ae47-7294-4960-b4f3-9d791c931a12") : configmap "swift-ring-files" not found Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.277875 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wklbl"] Dec 05 08:42:55 crc kubenswrapper[4795]: E1205 08:42:55.278294 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd94cdc2-bb5f-423f-b2d6-5e83c050cd07" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278309 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd94cdc2-bb5f-423f-b2d6-5e83c050cd07" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: E1205 08:42:55.278327 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" containerName="init" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278333 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" containerName="init" Dec 05 08:42:55 crc kubenswrapper[4795]: E1205 08:42:55.278342 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3d3a00-5afb-42a5-921c-e7e77afe8a7b" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278348 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3d3a00-5afb-42a5-921c-e7e77afe8a7b" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: E1205 08:42:55.278363 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" containerName="dnsmasq-dns" Dec 
05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278369 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" containerName="dnsmasq-dns" Dec 05 08:42:55 crc kubenswrapper[4795]: E1205 08:42:55.278379 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b0b091-d9e3-4203-a73f-bd38fe4105f8" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278384 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b0b091-d9e3-4203-a73f-bd38fe4105f8" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: E1205 08:42:55.278398 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b485181e-1817-4475-82e6-fa5705a822c1" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278404 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b485181e-1817-4475-82e6-fa5705a822c1" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: E1205 08:42:55.278424 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71c1756-d33b-48eb-b3e5-9298a6ba19e0" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278431 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71c1756-d33b-48eb-b3e5-9298a6ba19e0" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: E1205 08:42:55.278451 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23b22dd0-9f53-4a9f-a431-3de3d43d1e14" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278457 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="23b22dd0-9f53-4a9f-a431-3de3d43d1e14" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278627 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bd94cdc2-bb5f-423f-b2d6-5e83c050cd07" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278635 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b485181e-1817-4475-82e6-fa5705a822c1" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278647 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0ebab7-6587-452e-b3ed-7c6d208c637c" containerName="dnsmasq-dns" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278660 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71c1756-d33b-48eb-b3e5-9298a6ba19e0" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278673 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="23b22dd0-9f53-4a9f-a431-3de3d43d1e14" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278680 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b0b091-d9e3-4203-a73f-bd38fe4105f8" containerName="mariadb-account-create-update" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.278691 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3d3a00-5afb-42a5-921c-e7e77afe8a7b" containerName="mariadb-database-create" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.279356 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.281983 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.291399 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wklbl"] Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.291411 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wbtsl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.412599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5rvr\" (UniqueName: \"kubernetes.io/projected/9fc84840-1b4e-4838-8083-61c785bec8a2-kube-api-access-s5rvr\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.412685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-combined-ca-bundle\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.412739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-db-sync-config-data\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.412783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-config-data\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.514286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-db-sync-config-data\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.514335 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-config-data\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.514505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5rvr\" (UniqueName: \"kubernetes.io/projected/9fc84840-1b4e-4838-8083-61c785bec8a2-kube-api-access-s5rvr\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.514525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-combined-ca-bundle\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.521118 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-db-sync-config-data\") pod \"glance-db-sync-wklbl\" (UID: 
\"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.521118 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-combined-ca-bundle\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.522336 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-config-data\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.535752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5rvr\" (UniqueName: \"kubernetes.io/projected/9fc84840-1b4e-4838-8083-61c785bec8a2-kube-api-access-s5rvr\") pod \"glance-db-sync-wklbl\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:55 crc kubenswrapper[4795]: I1205 08:42:55.600785 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wklbl" Dec 05 08:42:56 crc kubenswrapper[4795]: I1205 08:42:56.272524 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wklbl"] Dec 05 08:42:57 crc kubenswrapper[4795]: I1205 08:42:57.136700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wklbl" event={"ID":"9fc84840-1b4e-4838-8083-61c785bec8a2","Type":"ContainerStarted","Data":"6fecd68da276b5696938457b4152ba81cc328b70f14d51052639f53328b82469"} Dec 05 08:42:58 crc kubenswrapper[4795]: I1205 08:42:58.149702 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d94f47e-cb5c-427e-b529-dee69261109f" containerID="30eb6bd0ec95f954e4988e2c1e5e4fac4f98ade8e0e3b252db7e3206c24d9263" exitCode=0 Dec 05 08:42:58 crc kubenswrapper[4795]: I1205 08:42:58.149819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m46d5" event={"ID":"7d94f47e-cb5c-427e-b529-dee69261109f","Type":"ContainerDied","Data":"30eb6bd0ec95f954e4988e2c1e5e4fac4f98ade8e0e3b252db7e3206c24d9263"} Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.310460 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pbgkm" podUID="ec90f56f-9ed8-4175-9736-6e0f07d7078f" containerName="ovn-controller" probeResult="failure" output=< Dec 05 08:42:59 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 08:42:59 crc kubenswrapper[4795]: > Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.394927 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.535121 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.710304 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-ring-data-devices\") pod \"7d94f47e-cb5c-427e-b529-dee69261109f\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.710428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d94f47e-cb5c-427e-b529-dee69261109f-etc-swift\") pod \"7d94f47e-cb5c-427e-b529-dee69261109f\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.710488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-swiftconf\") pod \"7d94f47e-cb5c-427e-b529-dee69261109f\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.710750 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rk97\" (UniqueName: \"kubernetes.io/projected/7d94f47e-cb5c-427e-b529-dee69261109f-kube-api-access-2rk97\") pod \"7d94f47e-cb5c-427e-b529-dee69261109f\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.710805 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-scripts\") pod \"7d94f47e-cb5c-427e-b529-dee69261109f\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.710848 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-combined-ca-bundle\") pod \"7d94f47e-cb5c-427e-b529-dee69261109f\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.711919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7d94f47e-cb5c-427e-b529-dee69261109f" (UID: "7d94f47e-cb5c-427e-b529-dee69261109f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.712163 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d94f47e-cb5c-427e-b529-dee69261109f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7d94f47e-cb5c-427e-b529-dee69261109f" (UID: "7d94f47e-cb5c-427e-b529-dee69261109f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.712468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-dispersionconf\") pod \"7d94f47e-cb5c-427e-b529-dee69261109f\" (UID: \"7d94f47e-cb5c-427e-b529-dee69261109f\") " Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.713369 4795 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.713526 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d94f47e-cb5c-427e-b529-dee69261109f-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.719039 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d94f47e-cb5c-427e-b529-dee69261109f-kube-api-access-2rk97" (OuterVolumeSpecName: "kube-api-access-2rk97") pod "7d94f47e-cb5c-427e-b529-dee69261109f" (UID: "7d94f47e-cb5c-427e-b529-dee69261109f"). InnerVolumeSpecName "kube-api-access-2rk97". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.730068 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7d94f47e-cb5c-427e-b529-dee69261109f" (UID: "7d94f47e-cb5c-427e-b529-dee69261109f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.744156 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d94f47e-cb5c-427e-b529-dee69261109f" (UID: "7d94f47e-cb5c-427e-b529-dee69261109f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.744219 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7d94f47e-cb5c-427e-b529-dee69261109f" (UID: "7d94f47e-cb5c-427e-b529-dee69261109f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.753465 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-scripts" (OuterVolumeSpecName: "scripts") pod "7d94f47e-cb5c-427e-b529-dee69261109f" (UID: "7d94f47e-cb5c-427e-b529-dee69261109f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.814682 4795 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.814727 4795 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.814739 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rk97\" (UniqueName: \"kubernetes.io/projected/7d94f47e-cb5c-427e-b529-dee69261109f-kube-api-access-2rk97\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.814751 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d94f47e-cb5c-427e-b529-dee69261109f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:42:59 crc kubenswrapper[4795]: I1205 08:42:59.814759 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d94f47e-cb5c-427e-b529-dee69261109f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:00 crc kubenswrapper[4795]: I1205 08:43:00.185800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-m46d5" event={"ID":"7d94f47e-cb5c-427e-b529-dee69261109f","Type":"ContainerDied","Data":"da26000c042c43965bcc9f51b3265e4c28a4357f983b7ee8f974613ad74c1d87"} Dec 05 08:43:00 crc kubenswrapper[4795]: I1205 08:43:00.186411 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da26000c042c43965bcc9f51b3265e4c28a4357f983b7ee8f974613ad74c1d87" Dec 05 08:43:00 crc kubenswrapper[4795]: I1205 08:43:00.185872 4795 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m46d5" Dec 05 08:43:02 crc kubenswrapper[4795]: I1205 08:43:02.227822 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerID="1c8eefd545af59a05a444f037361105679aae6c1a607df37c24bb29c03aba3d2" exitCode=0 Dec 05 08:43:02 crc kubenswrapper[4795]: I1205 08:43:02.227952 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec8515f5-24b3-4930-9df2-90c25e2f8e6e","Type":"ContainerDied","Data":"1c8eefd545af59a05a444f037361105679aae6c1a607df37c24bb29c03aba3d2"} Dec 05 08:43:02 crc kubenswrapper[4795]: I1205 08:43:02.237343 4795 generic.go:334] "Generic (PLEG): container finished" podID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerID="7ee452d101693c0da47a464215ca51c496b979b8ffd1ec947fd8d52320f0ac04" exitCode=0 Dec 05 08:43:02 crc kubenswrapper[4795]: I1205 08:43:02.237405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"956aa512-9ab5-4c74-863b-3ed2a14535d9","Type":"ContainerDied","Data":"7ee452d101693c0da47a464215ca51c496b979b8ffd1ec947fd8d52320f0ac04"} Dec 05 08:43:02 crc kubenswrapper[4795]: I1205 08:43:02.736728 4795 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb716ddfa-bbff-444b-bed7-275b451068bf"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb716ddfa-bbff-444b-bed7-275b451068bf] : Timed out while waiting for systemd to remove kubepods-besteffort-podb716ddfa_bbff_444b_bed7_275b451068bf.slice" Dec 05 08:43:02 crc kubenswrapper[4795]: E1205 08:43:02.736887 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podb716ddfa-bbff-444b-bed7-275b451068bf] : unable to destroy cgroup paths for cgroup [kubepods besteffort podb716ddfa-bbff-444b-bed7-275b451068bf] : Timed 
out while waiting for systemd to remove kubepods-besteffort-podb716ddfa_bbff_444b_bed7_275b451068bf.slice" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" podUID="b716ddfa-bbff-444b-bed7-275b451068bf" Dec 05 08:43:03 crc kubenswrapper[4795]: I1205 08:43:03.245779 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9gb88" Dec 05 08:43:03 crc kubenswrapper[4795]: I1205 08:43:03.323550 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9gb88"] Dec 05 08:43:03 crc kubenswrapper[4795]: I1205 08:43:03.328714 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9gb88"] Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.299670 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pbgkm" podUID="ec90f56f-9ed8-4175-9736-6e0f07d7078f" containerName="ovn-controller" probeResult="failure" output=< Dec 05 08:43:04 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 08:43:04 crc kubenswrapper[4795]: > Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.378027 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5vnnk" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.610153 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pbgkm-config-ztvlm"] Dec 05 08:43:04 crc kubenswrapper[4795]: E1205 08:43:04.610813 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d94f47e-cb5c-427e-b529-dee69261109f" containerName="swift-ring-rebalance" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.610843 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d94f47e-cb5c-427e-b529-dee69261109f" containerName="swift-ring-rebalance" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.611070 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7d94f47e-cb5c-427e-b529-dee69261109f" containerName="swift-ring-rebalance" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.611951 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.614231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.618704 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbgkm-config-ztvlm"] Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.724309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjkl6\" (UniqueName: \"kubernetes.io/projected/75c418c5-3bed-42d2-b613-7824eae438e7-kube-api-access-zjkl6\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.724405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.724597 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-scripts\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.724904 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run-ovn\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.725000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-log-ovn\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.725066 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-additional-scripts\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.762011 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b716ddfa-bbff-444b-bed7-275b451068bf" path="/var/lib/kubelet/pods/b716ddfa-bbff-444b-bed7-275b451068bf/volumes" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.826933 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run-ovn\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.827001 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-log-ovn\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.827035 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-additional-scripts\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.827138 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjkl6\" (UniqueName: \"kubernetes.io/projected/75c418c5-3bed-42d2-b613-7824eae438e7-kube-api-access-zjkl6\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.827193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.827244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-scripts\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.828134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run-ovn\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.828137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.828157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-log-ovn\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.829292 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-additional-scripts\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.831588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-scripts\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.866848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjkl6\" (UniqueName: 
\"kubernetes.io/projected/75c418c5-3bed-42d2-b613-7824eae438e7-kube-api-access-zjkl6\") pod \"ovn-controller-pbgkm-config-ztvlm\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:04 crc kubenswrapper[4795]: I1205 08:43:04.933904 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:09 crc kubenswrapper[4795]: I1205 08:43:09.695002 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-pbgkm" podUID="ec90f56f-9ed8-4175-9736-6e0f07d7078f" containerName="ovn-controller" probeResult="failure" output=< Dec 05 08:43:09 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 05 08:43:09 crc kubenswrapper[4795]: > Dec 05 08:43:10 crc kubenswrapper[4795]: I1205 08:43:10.036710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:43:10 crc kubenswrapper[4795]: I1205 08:43:10.068578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c346ae47-7294-4960-b4f3-9d791c931a12-etc-swift\") pod \"swift-storage-0\" (UID: \"c346ae47-7294-4960-b4f3-9d791c931a12\") " pod="openstack/swift-storage-0" Dec 05 08:43:10 crc kubenswrapper[4795]: I1205 08:43:10.355022 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.268750 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbgkm-config-ztvlm"] Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.359789 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.362506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec8515f5-24b3-4930-9df2-90c25e2f8e6e","Type":"ContainerStarted","Data":"3154fd32dc6a1aff1527c514daea43379e594659496943260e3a099d08c27be0"} Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.362912 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.366290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"956aa512-9ab5-4c74-863b-3ed2a14535d9","Type":"ContainerStarted","Data":"d28e7a08bec20bcef2029f17c27a0e6d302c0975fd9378fc98d20589526ddab7"} Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.367859 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.378242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm-config-ztvlm" event={"ID":"75c418c5-3bed-42d2-b613-7824eae438e7","Type":"ContainerStarted","Data":"1588c092726d7f52c13925bacdb37b203ea10f54f63a2a305833accff430828e"} Dec 05 08:43:12 crc kubenswrapper[4795]: W1205 08:43:12.379542 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc346ae47_7294_4960_b4f3_9d791c931a12.slice/crio-d3c48601beae3065d1ca08644a15f41f648fd3ac8dd1f62a43c0d3658a2d9d59 WatchSource:0}: Error finding container 
d3c48601beae3065d1ca08644a15f41f648fd3ac8dd1f62a43c0d3658a2d9d59: Status 404 returned error can't find the container with id d3c48601beae3065d1ca08644a15f41f648fd3ac8dd1f62a43c0d3658a2d9d59 Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.400171 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.522094585 podStartE2EDuration="1m33.400134679s" podCreationTimestamp="2025-12-05 08:41:39 +0000 UTC" firstStartedPulling="2025-12-05 08:41:41.847076394 +0000 UTC m=+1053.419680133" lastFinishedPulling="2025-12-05 08:42:28.725116478 +0000 UTC m=+1100.297720227" observedRunningTime="2025-12-05 08:43:12.392547625 +0000 UTC m=+1143.965151364" watchObservedRunningTime="2025-12-05 08:43:12.400134679 +0000 UTC m=+1143.972738418" Dec 05 08:43:12 crc kubenswrapper[4795]: I1205 08:43:12.449757 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.42810944 podStartE2EDuration="1m33.449726844s" podCreationTimestamp="2025-12-05 08:41:39 +0000 UTC" firstStartedPulling="2025-12-05 08:41:41.755749436 +0000 UTC m=+1053.328353165" lastFinishedPulling="2025-12-05 08:42:28.77736683 +0000 UTC m=+1100.349970569" observedRunningTime="2025-12-05 08:43:12.443772381 +0000 UTC m=+1144.016376120" watchObservedRunningTime="2025-12-05 08:43:12.449726844 +0000 UTC m=+1144.022330583" Dec 05 08:43:13 crc kubenswrapper[4795]: I1205 08:43:13.392204 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"d3c48601beae3065d1ca08644a15f41f648fd3ac8dd1f62a43c0d3658a2d9d59"} Dec 05 08:43:13 crc kubenswrapper[4795]: I1205 08:43:13.396087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wklbl" 
event={"ID":"9fc84840-1b4e-4838-8083-61c785bec8a2","Type":"ContainerStarted","Data":"6f80e22a373a0c17b426ec9109bf61ff04782a7078bef7ec3d08ea87c6f7b1b0"} Dec 05 08:43:13 crc kubenswrapper[4795]: I1205 08:43:13.400080 4795 generic.go:334] "Generic (PLEG): container finished" podID="75c418c5-3bed-42d2-b613-7824eae438e7" containerID="1b4c7de9fcb061cea40e9db8c57dfb855602c2e628cd66646181cfb67fb201e1" exitCode=0 Dec 05 08:43:13 crc kubenswrapper[4795]: I1205 08:43:13.400208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm-config-ztvlm" event={"ID":"75c418c5-3bed-42d2-b613-7824eae438e7","Type":"ContainerDied","Data":"1b4c7de9fcb061cea40e9db8c57dfb855602c2e628cd66646181cfb67fb201e1"} Dec 05 08:43:13 crc kubenswrapper[4795]: I1205 08:43:13.414521 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wklbl" podStartSLOduration=3.047437486 podStartE2EDuration="18.414499198s" podCreationTimestamp="2025-12-05 08:42:55 +0000 UTC" firstStartedPulling="2025-12-05 08:42:56.286518791 +0000 UTC m=+1127.859122530" lastFinishedPulling="2025-12-05 08:43:11.653580513 +0000 UTC m=+1143.226184242" observedRunningTime="2025-12-05 08:43:13.413021191 +0000 UTC m=+1144.985624930" watchObservedRunningTime="2025-12-05 08:43:13.414499198 +0000 UTC m=+1144.987102937" Dec 05 08:43:14 crc kubenswrapper[4795]: I1205 08:43:14.298694 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-pbgkm" Dec 05 08:43:14 crc kubenswrapper[4795]: I1205 08:43:14.414470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"c88fd9314d6f75ade93d929e6adbe3da1404451f20dd90534a67e6cf75277ff5"} Dec 05 08:43:14 crc kubenswrapper[4795]: I1205 08:43:14.414530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"f1cf48d68aa768237969bb6277f47656e1e08a9a51712da6ac17a78624cfb1dc"} Dec 05 08:43:14 crc kubenswrapper[4795]: I1205 08:43:14.414541 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"a259c80b6da7c107d4a7dc9debdfba8988358626a93df299b8879f810c58af5c"} Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.289127 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.361971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-additional-scripts\") pod \"75c418c5-3bed-42d2-b613-7824eae438e7\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.362031 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run-ovn\") pod \"75c418c5-3bed-42d2-b613-7824eae438e7\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.362068 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjkl6\" (UniqueName: \"kubernetes.io/projected/75c418c5-3bed-42d2-b613-7824eae438e7-kube-api-access-zjkl6\") pod \"75c418c5-3bed-42d2-b613-7824eae438e7\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.362099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run\") pod 
\"75c418c5-3bed-42d2-b613-7824eae438e7\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.362179 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-log-ovn\") pod \"75c418c5-3bed-42d2-b613-7824eae438e7\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.362234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-scripts\") pod \"75c418c5-3bed-42d2-b613-7824eae438e7\" (UID: \"75c418c5-3bed-42d2-b613-7824eae438e7\") " Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.362773 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "75c418c5-3bed-42d2-b613-7824eae438e7" (UID: "75c418c5-3bed-42d2-b613-7824eae438e7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.362842 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run" (OuterVolumeSpecName: "var-run") pod "75c418c5-3bed-42d2-b613-7824eae438e7" (UID: "75c418c5-3bed-42d2-b613-7824eae438e7"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.362865 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "75c418c5-3bed-42d2-b613-7824eae438e7" (UID: "75c418c5-3bed-42d2-b613-7824eae438e7"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.363249 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "75c418c5-3bed-42d2-b613-7824eae438e7" (UID: "75c418c5-3bed-42d2-b613-7824eae438e7"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.363570 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-scripts" (OuterVolumeSpecName: "scripts") pod "75c418c5-3bed-42d2-b613-7824eae438e7" (UID: "75c418c5-3bed-42d2-b613-7824eae438e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.384709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c418c5-3bed-42d2-b613-7824eae438e7-kube-api-access-zjkl6" (OuterVolumeSpecName: "kube-api-access-zjkl6") pod "75c418c5-3bed-42d2-b613-7824eae438e7" (UID: "75c418c5-3bed-42d2-b613-7824eae438e7"). InnerVolumeSpecName "kube-api-access-zjkl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.433840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"cd6f34cb2f5b8fafd1855f66667464684d12b1b290f763e21411b7897677eda8"} Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.446978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm-config-ztvlm" event={"ID":"75c418c5-3bed-42d2-b613-7824eae438e7","Type":"ContainerDied","Data":"1588c092726d7f52c13925bacdb37b203ea10f54f63a2a305833accff430828e"} Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.447301 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1588c092726d7f52c13925bacdb37b203ea10f54f63a2a305833accff430828e" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.447442 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pbgkm-config-ztvlm" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.465586 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.465645 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.465656 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/75c418c5-3bed-42d2-b613-7824eae438e7-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.465669 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.465686 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjkl6\" (UniqueName: \"kubernetes.io/projected/75c418c5-3bed-42d2-b613-7824eae438e7-kube-api-access-zjkl6\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:15 crc kubenswrapper[4795]: I1205 08:43:15.465700 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/75c418c5-3bed-42d2-b613-7824eae438e7-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.439825 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pbgkm-config-ztvlm"] Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.454331 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-controller-pbgkm-config-ztvlm"] Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.558499 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-pbgkm-config-m56pd"] Dec 05 08:43:16 crc kubenswrapper[4795]: E1205 08:43:16.559001 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c418c5-3bed-42d2-b613-7824eae438e7" containerName="ovn-config" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.559024 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c418c5-3bed-42d2-b613-7824eae438e7" containerName="ovn-config" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.559173 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c418c5-3bed-42d2-b613-7824eae438e7" containerName="ovn-config" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.559923 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.563047 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.580008 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbgkm-config-m56pd"] Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.585341 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-additional-scripts\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.585420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wsdw\" (UniqueName: 
\"kubernetes.io/projected/323deeb7-0bda-440f-b0a1-90d33f44730e-kube-api-access-9wsdw\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.585448 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-log-ovn\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.585487 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run-ovn\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.585527 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.585568 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-scripts\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.687442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-9wsdw\" (UniqueName: \"kubernetes.io/projected/323deeb7-0bda-440f-b0a1-90d33f44730e-kube-api-access-9wsdw\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.687559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-log-ovn\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.687655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run-ovn\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.688109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run-ovn\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.688058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-log-ovn\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.688138 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.688270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.688341 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-scripts\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.688441 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-additional-scripts\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.689524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-additional-scripts\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.690756 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-scripts\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.726797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wsdw\" (UniqueName: \"kubernetes.io/projected/323deeb7-0bda-440f-b0a1-90d33f44730e-kube-api-access-9wsdw\") pod \"ovn-controller-pbgkm-config-m56pd\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.757356 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c418c5-3bed-42d2-b613-7824eae438e7" path="/var/lib/kubelet/pods/75c418c5-3bed-42d2-b613-7824eae438e7/volumes" Dec 05 08:43:16 crc kubenswrapper[4795]: I1205 08:43:16.881964 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:17 crc kubenswrapper[4795]: I1205 08:43:17.399691 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-pbgkm-config-m56pd"] Dec 05 08:43:17 crc kubenswrapper[4795]: I1205 08:43:17.508729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"88519681bc390a6da920bbc6340a62cfea10f8c39be0cc60bc911e92fb77f784"} Dec 05 08:43:17 crc kubenswrapper[4795]: I1205 08:43:17.508791 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"83308cf944ca897a39591d3efce890cba135a0164d3baf9617d1cdfdc7d12fc3"} Dec 05 08:43:17 crc kubenswrapper[4795]: I1205 08:43:17.508804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"cbe0eb85bbc7a088bb04490fa624e4225f77bda402db61719105bee0872ed9fa"} Dec 05 08:43:17 crc kubenswrapper[4795]: I1205 08:43:17.514632 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm-config-m56pd" event={"ID":"323deeb7-0bda-440f-b0a1-90d33f44730e","Type":"ContainerStarted","Data":"07f258d52e8e8c10915a5648904d0564df7c7959ac011bf8a495999cb1675a13"} Dec 05 08:43:18 crc kubenswrapper[4795]: I1205 08:43:18.528867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"752da13389e6aa99fcea6985553463cd47fdc639022f3ebf29940594195dde78"} Dec 05 08:43:18 crc kubenswrapper[4795]: I1205 08:43:18.530663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm-config-m56pd" event={"ID":"323deeb7-0bda-440f-b0a1-90d33f44730e","Type":"ContainerStarted","Data":"55846ea4e6e7ef73251e4dd94a954dede2cbec9808a4b9564d739573c9c9be79"} Dec 05 08:43:18 crc kubenswrapper[4795]: I1205 08:43:18.566741 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-pbgkm-config-m56pd" podStartSLOduration=2.5667047800000002 podStartE2EDuration="2.56670478s" podCreationTimestamp="2025-12-05 08:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:18.566312079 +0000 UTC m=+1150.138915828" watchObservedRunningTime="2025-12-05 08:43:18.56670478 +0000 UTC m=+1150.139308529" Dec 05 08:43:19 crc kubenswrapper[4795]: I1205 08:43:19.542752 4795 generic.go:334] "Generic (PLEG): container finished" podID="323deeb7-0bda-440f-b0a1-90d33f44730e" containerID="55846ea4e6e7ef73251e4dd94a954dede2cbec9808a4b9564d739573c9c9be79" exitCode=0 Dec 05 08:43:19 crc kubenswrapper[4795]: I1205 08:43:19.542811 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm-config-m56pd" event={"ID":"323deeb7-0bda-440f-b0a1-90d33f44730e","Type":"ContainerDied","Data":"55846ea4e6e7ef73251e4dd94a954dede2cbec9808a4b9564d739573c9c9be79"} Dec 05 08:43:20 crc kubenswrapper[4795]: I1205 08:43:20.563643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"83185e21cc06aeaf083de8976aac7a67be1c32464068f9611ab9571b9b558e44"} Dec 05 08:43:20 crc kubenswrapper[4795]: I1205 08:43:20.565805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"cb2ee75255b08c44c264bea91b3ece3d910bd79c71e93d70f881ccc347571a9e"} Dec 05 08:43:20 crc kubenswrapper[4795]: I1205 08:43:20.565825 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"5d04d7450e5f41acf30013e21a0d1ebc7647a9af68cc5c63b696cbd862d227b0"} Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.074526 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.127267 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-scripts\") pod \"323deeb7-0bda-440f-b0a1-90d33f44730e\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.128995 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-scripts" (OuterVolumeSpecName: "scripts") pod "323deeb7-0bda-440f-b0a1-90d33f44730e" (UID: "323deeb7-0bda-440f-b0a1-90d33f44730e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.129240 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run\") pod \"323deeb7-0bda-440f-b0a1-90d33f44730e\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.129318 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run" (OuterVolumeSpecName: "var-run") pod "323deeb7-0bda-440f-b0a1-90d33f44730e" (UID: "323deeb7-0bda-440f-b0a1-90d33f44730e"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.129279 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wsdw\" (UniqueName: \"kubernetes.io/projected/323deeb7-0bda-440f-b0a1-90d33f44730e-kube-api-access-9wsdw\") pod \"323deeb7-0bda-440f-b0a1-90d33f44730e\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.130609 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-additional-scripts\") pod \"323deeb7-0bda-440f-b0a1-90d33f44730e\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.131313 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-log-ovn\") pod \"323deeb7-0bda-440f-b0a1-90d33f44730e\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.131355 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run-ovn\") pod \"323deeb7-0bda-440f-b0a1-90d33f44730e\" (UID: \"323deeb7-0bda-440f-b0a1-90d33f44730e\") " Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.131812 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.131833 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run\") on node \"crc\" DevicePath \"\"" Dec 05 
08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.131862 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "323deeb7-0bda-440f-b0a1-90d33f44730e" (UID: "323deeb7-0bda-440f-b0a1-90d33f44730e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.131888 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "323deeb7-0bda-440f-b0a1-90d33f44730e" (UID: "323deeb7-0bda-440f-b0a1-90d33f44730e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.132245 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "323deeb7-0bda-440f-b0a1-90d33f44730e" (UID: "323deeb7-0bda-440f-b0a1-90d33f44730e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.138972 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/323deeb7-0bda-440f-b0a1-90d33f44730e-kube-api-access-9wsdw" (OuterVolumeSpecName: "kube-api-access-9wsdw") pod "323deeb7-0bda-440f-b0a1-90d33f44730e" (UID: "323deeb7-0bda-440f-b0a1-90d33f44730e"). InnerVolumeSpecName "kube-api-access-9wsdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.234348 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.234385 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/323deeb7-0bda-440f-b0a1-90d33f44730e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.234398 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wsdw\" (UniqueName: \"kubernetes.io/projected/323deeb7-0bda-440f-b0a1-90d33f44730e-kube-api-access-9wsdw\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.234411 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/323deeb7-0bda-440f-b0a1-90d33f44730e-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.345281 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.579893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"371ece95ed98edb282901c6dd55fead818741238bfec8c9d74c6c6a9299d7990"} Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.579955 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"107f185561c68ab90f2033b7c6cd5f916b50f47fd9005553cfa8b6629242e462"} Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.579968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"2d1dccab079452a4ff98edfafef4a7fec690114911c2d9e7562ebee4bc041b35"} Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.579979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c346ae47-7294-4960-b4f3-9d791c931a12","Type":"ContainerStarted","Data":"d8989a84a93f1c741f4c60ae81e8093ad0acd253c3e00fd216a0c82d9b807707"} Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.581979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-pbgkm-config-m56pd" event={"ID":"323deeb7-0bda-440f-b0a1-90d33f44730e","Type":"ContainerDied","Data":"07f258d52e8e8c10915a5648904d0564df7c7959ac011bf8a495999cb1675a13"} Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.582010 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07f258d52e8e8c10915a5648904d0564df7c7959ac011bf8a495999cb1675a13" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.582077 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-pbgkm-config-m56pd" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.653560 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.188154891 podStartE2EDuration="44.65353326s" podCreationTimestamp="2025-12-05 08:42:37 +0000 UTC" firstStartedPulling="2025-12-05 08:43:12.39043423 +0000 UTC m=+1143.963037969" lastFinishedPulling="2025-12-05 08:43:19.855812599 +0000 UTC m=+1151.428416338" observedRunningTime="2025-12-05 08:43:21.639790147 +0000 UTC m=+1153.212393886" watchObservedRunningTime="2025-12-05 08:43:21.65353326 +0000 UTC m=+1153.226136999" Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.741009 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-pbgkm-config-m56pd"] Dec 05 08:43:21 crc kubenswrapper[4795]: I1205 08:43:21.754866 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-pbgkm-config-m56pd"] Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.094936 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2w7h7"] Dec 05 08:43:22 crc kubenswrapper[4795]: E1205 08:43:22.095437 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323deeb7-0bda-440f-b0a1-90d33f44730e" containerName="ovn-config" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.095460 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="323deeb7-0bda-440f-b0a1-90d33f44730e" containerName="ovn-config" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.095695 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="323deeb7-0bda-440f-b0a1-90d33f44730e" containerName="ovn-config" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.096780 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.106177 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.143325 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2w7h7"] Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.151889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-config\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.151970 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.152021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cxm5\" (UniqueName: \"kubernetes.io/projected/b12d7617-6ee5-4227-986e-24bc7d59ab13-kube-api-access-4cxm5\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.152083 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.152118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.152167 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.254049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-config\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.254131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.254164 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cxm5\" (UniqueName: \"kubernetes.io/projected/b12d7617-6ee5-4227-986e-24bc7d59ab13-kube-api-access-4cxm5\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.254207 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.254236 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.254275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.255484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.255531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-config\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc 
kubenswrapper[4795]: I1205 08:43:22.255685 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.256177 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.256738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.280516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cxm5\" (UniqueName: \"kubernetes.io/projected/b12d7617-6ee5-4227-986e-24bc7d59ab13-kube-api-access-4cxm5\") pod \"dnsmasq-dns-5c79d794d7-2w7h7\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.414422 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:22 crc kubenswrapper[4795]: I1205 08:43:22.760905 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="323deeb7-0bda-440f-b0a1-90d33f44730e" path="/var/lib/kubelet/pods/323deeb7-0bda-440f-b0a1-90d33f44730e/volumes" Dec 05 08:43:23 crc kubenswrapper[4795]: I1205 08:43:23.064856 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2w7h7"] Dec 05 08:43:23 crc kubenswrapper[4795]: I1205 08:43:23.664211 4795 generic.go:334] "Generic (PLEG): container finished" podID="b12d7617-6ee5-4227-986e-24bc7d59ab13" containerID="68acd4634fef3c7693199a79755e626884bbb5a02aea88b58dffb628e4c5a292" exitCode=0 Dec 05 08:43:23 crc kubenswrapper[4795]: I1205 08:43:23.664318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" event={"ID":"b12d7617-6ee5-4227-986e-24bc7d59ab13","Type":"ContainerDied","Data":"68acd4634fef3c7693199a79755e626884bbb5a02aea88b58dffb628e4c5a292"} Dec 05 08:43:23 crc kubenswrapper[4795]: I1205 08:43:23.665080 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" event={"ID":"b12d7617-6ee5-4227-986e-24bc7d59ab13","Type":"ContainerStarted","Data":"f08a5eccdbd16b71f7240a7340c7da3c4999c7bbbd88da36d87dfc31793bedcc"} Dec 05 08:43:24 crc kubenswrapper[4795]: I1205 08:43:24.676718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" event={"ID":"b12d7617-6ee5-4227-986e-24bc7d59ab13","Type":"ContainerStarted","Data":"68bffd95884faa6a9f67416889f9419e2cf349d11a221f0567998d18b4afefc9"} Dec 05 08:43:24 crc kubenswrapper[4795]: I1205 08:43:24.678436 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:24 crc kubenswrapper[4795]: I1205 08:43:24.704985 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" podStartSLOduration=2.704960702 podStartE2EDuration="2.704960702s" podCreationTimestamp="2025-12-05 08:43:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:24.701183934 +0000 UTC m=+1156.273787703" watchObservedRunningTime="2025-12-05 08:43:24.704960702 +0000 UTC m=+1156.277564441" Dec 05 08:43:26 crc kubenswrapper[4795]: I1205 08:43:26.705094 4795 generic.go:334] "Generic (PLEG): container finished" podID="9fc84840-1b4e-4838-8083-61c785bec8a2" containerID="6f80e22a373a0c17b426ec9109bf61ff04782a7078bef7ec3d08ea87c6f7b1b0" exitCode=0 Dec 05 08:43:26 crc kubenswrapper[4795]: I1205 08:43:26.705196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wklbl" event={"ID":"9fc84840-1b4e-4838-8083-61c785bec8a2","Type":"ContainerDied","Data":"6f80e22a373a0c17b426ec9109bf61ff04782a7078bef7ec3d08ea87c6f7b1b0"} Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.160310 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wklbl" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.288956 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-config-data\") pod \"9fc84840-1b4e-4838-8083-61c785bec8a2\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.289328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-combined-ca-bundle\") pod \"9fc84840-1b4e-4838-8083-61c785bec8a2\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.289553 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5rvr\" (UniqueName: \"kubernetes.io/projected/9fc84840-1b4e-4838-8083-61c785bec8a2-kube-api-access-s5rvr\") pod \"9fc84840-1b4e-4838-8083-61c785bec8a2\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.289734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-db-sync-config-data\") pod \"9fc84840-1b4e-4838-8083-61c785bec8a2\" (UID: \"9fc84840-1b4e-4838-8083-61c785bec8a2\") " Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.295693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc84840-1b4e-4838-8083-61c785bec8a2-kube-api-access-s5rvr" (OuterVolumeSpecName: "kube-api-access-s5rvr") pod "9fc84840-1b4e-4838-8083-61c785bec8a2" (UID: "9fc84840-1b4e-4838-8083-61c785bec8a2"). InnerVolumeSpecName "kube-api-access-s5rvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.297237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9fc84840-1b4e-4838-8083-61c785bec8a2" (UID: "9fc84840-1b4e-4838-8083-61c785bec8a2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.324340 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fc84840-1b4e-4838-8083-61c785bec8a2" (UID: "9fc84840-1b4e-4838-8083-61c785bec8a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.354648 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-config-data" (OuterVolumeSpecName: "config-data") pod "9fc84840-1b4e-4838-8083-61c785bec8a2" (UID: "9fc84840-1b4e-4838-8083-61c785bec8a2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.404357 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5rvr\" (UniqueName: \"kubernetes.io/projected/9fc84840-1b4e-4838-8083-61c785bec8a2-kube-api-access-s5rvr\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.404473 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.404488 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.404499 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc84840-1b4e-4838-8083-61c785bec8a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.739770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wklbl" event={"ID":"9fc84840-1b4e-4838-8083-61c785bec8a2","Type":"ContainerDied","Data":"6fecd68da276b5696938457b4152ba81cc328b70f14d51052639f53328b82469"} Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.739856 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fecd68da276b5696938457b4152ba81cc328b70f14d51052639f53328b82469" Dec 05 08:43:28 crc kubenswrapper[4795]: I1205 08:43:28.740011 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wklbl" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.262197 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2w7h7"] Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.262507 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" podUID="b12d7617-6ee5-4227-986e-24bc7d59ab13" containerName="dnsmasq-dns" containerID="cri-o://68bffd95884faa6a9f67416889f9419e2cf349d11a221f0567998d18b4afefc9" gracePeriod=10 Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.264451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.306947 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztw96"] Dec 05 08:43:29 crc kubenswrapper[4795]: E1205 08:43:29.307347 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc84840-1b4e-4838-8083-61c785bec8a2" containerName="glance-db-sync" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.307364 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc84840-1b4e-4838-8083-61c785bec8a2" containerName="glance-db-sync" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.307575 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc84840-1b4e-4838-8083-61c785bec8a2" containerName="glance-db-sync" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.308482 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.355198 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztw96"] Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.428936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.429029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.429083 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grg6r\" (UniqueName: \"kubernetes.io/projected/b6edcee3-faa8-4de5-8a83-d0dd6803844a-kube-api-access-grg6r\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.429135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-config\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.429168 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.429333 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.533523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-config\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.533601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.533666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.533728 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.533790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.533822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grg6r\" (UniqueName: \"kubernetes.io/projected/b6edcee3-faa8-4de5-8a83-d0dd6803844a-kube-api-access-grg6r\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.534686 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-config\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.535262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.535485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.535841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.538812 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.571194 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grg6r\" (UniqueName: \"kubernetes.io/projected/b6edcee3-faa8-4de5-8a83-d0dd6803844a-kube-api-access-grg6r\") pod \"dnsmasq-dns-5f59b8f679-ztw96\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.672481 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.755977 4795 generic.go:334] "Generic (PLEG): container finished" podID="b12d7617-6ee5-4227-986e-24bc7d59ab13" containerID="68bffd95884faa6a9f67416889f9419e2cf349d11a221f0567998d18b4afefc9" exitCode=0 Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.756464 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" event={"ID":"b12d7617-6ee5-4227-986e-24bc7d59ab13","Type":"ContainerDied","Data":"68bffd95884faa6a9f67416889f9419e2cf349d11a221f0567998d18b4afefc9"} Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.756488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" event={"ID":"b12d7617-6ee5-4227-986e-24bc7d59ab13","Type":"ContainerDied","Data":"f08a5eccdbd16b71f7240a7340c7da3c4999c7bbbd88da36d87dfc31793bedcc"} Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.756502 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f08a5eccdbd16b71f7240a7340c7da3c4999c7bbbd88da36d87dfc31793bedcc" Dec 05 08:43:29 crc kubenswrapper[4795]: I1205 08:43:29.797928 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.843455 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-nb\") pod \"b12d7617-6ee5-4227-986e-24bc7d59ab13\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.843546 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cxm5\" (UniqueName: \"kubernetes.io/projected/b12d7617-6ee5-4227-986e-24bc7d59ab13-kube-api-access-4cxm5\") pod \"b12d7617-6ee5-4227-986e-24bc7d59ab13\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.843686 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-sb\") pod \"b12d7617-6ee5-4227-986e-24bc7d59ab13\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.843808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-swift-storage-0\") pod \"b12d7617-6ee5-4227-986e-24bc7d59ab13\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.843861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-svc\") pod \"b12d7617-6ee5-4227-986e-24bc7d59ab13\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.843886 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-config\") pod \"b12d7617-6ee5-4227-986e-24bc7d59ab13\" (UID: \"b12d7617-6ee5-4227-986e-24bc7d59ab13\") " Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.876501 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12d7617-6ee5-4227-986e-24bc7d59ab13-kube-api-access-4cxm5" (OuterVolumeSpecName: "kube-api-access-4cxm5") pod "b12d7617-6ee5-4227-986e-24bc7d59ab13" (UID: "b12d7617-6ee5-4227-986e-24bc7d59ab13"). InnerVolumeSpecName "kube-api-access-4cxm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.947694 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cxm5\" (UniqueName: \"kubernetes.io/projected/b12d7617-6ee5-4227-986e-24bc7d59ab13-kube-api-access-4cxm5\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.956513 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-config" (OuterVolumeSpecName: "config") pod "b12d7617-6ee5-4227-986e-24bc7d59ab13" (UID: "b12d7617-6ee5-4227-986e-24bc7d59ab13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.957729 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b12d7617-6ee5-4227-986e-24bc7d59ab13" (UID: "b12d7617-6ee5-4227-986e-24bc7d59ab13"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.966452 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b12d7617-6ee5-4227-986e-24bc7d59ab13" (UID: "b12d7617-6ee5-4227-986e-24bc7d59ab13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.967709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b12d7617-6ee5-4227-986e-24bc7d59ab13" (UID: "b12d7617-6ee5-4227-986e-24bc7d59ab13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:29.970587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b12d7617-6ee5-4227-986e-24bc7d59ab13" (UID: "b12d7617-6ee5-4227-986e-24bc7d59ab13"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.049743 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.049780 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.049794 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.049805 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.049818 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12d7617-6ee5-4227-986e-24bc7d59ab13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.772511 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-2w7h7" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.801954 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.833834 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2w7h7"] Dec 05 08:43:30 crc kubenswrapper[4795]: I1205 08:43:30.846993 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2w7h7"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.160958 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztw96"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.354327 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-2mxzn"] Dec 05 08:43:31 crc kubenswrapper[4795]: E1205 08:43:31.356719 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12d7617-6ee5-4227-986e-24bc7d59ab13" containerName="init" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.356743 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12d7617-6ee5-4227-986e-24bc7d59ab13" containerName="init" Dec 05 08:43:31 crc kubenswrapper[4795]: E1205 08:43:31.356778 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12d7617-6ee5-4227-986e-24bc7d59ab13" containerName="dnsmasq-dns" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.356785 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12d7617-6ee5-4227-986e-24bc7d59ab13" containerName="dnsmasq-dns" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.357360 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12d7617-6ee5-4227-986e-24bc7d59ab13" containerName="dnsmasq-dns" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.359421 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.365373 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.433065 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2mxzn"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.511108 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggvqb\" (UniqueName: \"kubernetes.io/projected/1e2b20c8-28b6-4f45-825b-0452d27fa54a-kube-api-access-ggvqb\") pod \"cinder-db-create-2mxzn\" (UID: \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\") " pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.511197 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2b20c8-28b6-4f45-825b-0452d27fa54a-operator-scripts\") pod \"cinder-db-create-2mxzn\" (UID: \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\") " pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.576413 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-7zmc2"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.577994 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.603061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7zmc2"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.616234 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggvqb\" (UniqueName: \"kubernetes.io/projected/1e2b20c8-28b6-4f45-825b-0452d27fa54a-kube-api-access-ggvqb\") pod \"cinder-db-create-2mxzn\" (UID: \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\") " pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.616554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2b20c8-28b6-4f45-825b-0452d27fa54a-operator-scripts\") pod \"cinder-db-create-2mxzn\" (UID: \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\") " pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.617848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2b20c8-28b6-4f45-825b-0452d27fa54a-operator-scripts\") pod \"cinder-db-create-2mxzn\" (UID: \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\") " pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.638700 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a5de-account-create-update-wzl2s"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.640201 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.647387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.684689 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggvqb\" (UniqueName: \"kubernetes.io/projected/1e2b20c8-28b6-4f45-825b-0452d27fa54a-kube-api-access-ggvqb\") pod \"cinder-db-create-2mxzn\" (UID: \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\") " pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.717703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a5de-account-create-update-wzl2s"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.720604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/267818a6-5bea-4056-b190-a38341b02b4c-operator-scripts\") pod \"barbican-db-create-7zmc2\" (UID: \"267818a6-5bea-4056-b190-a38341b02b4c\") " pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.720936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqf4j\" (UniqueName: \"kubernetes.io/projected/267818a6-5bea-4056-b190-a38341b02b4c-kube-api-access-xqf4j\") pod \"barbican-db-create-7zmc2\" (UID: \"267818a6-5bea-4056-b190-a38341b02b4c\") " pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.818369 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.822417 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b3d5-account-create-update-cmx25"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.823775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c781e8-96e9-48d0-b837-7d63996efd39-operator-scripts\") pod \"barbican-a5de-account-create-update-wzl2s\" (UID: \"b6c781e8-96e9-48d0-b837-7d63996efd39\") " pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.823848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzxmz\" (UniqueName: \"kubernetes.io/projected/b6c781e8-96e9-48d0-b837-7d63996efd39-kube-api-access-xzxmz\") pod \"barbican-a5de-account-create-update-wzl2s\" (UID: \"b6c781e8-96e9-48d0-b837-7d63996efd39\") " pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.823909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqf4j\" (UniqueName: \"kubernetes.io/projected/267818a6-5bea-4056-b190-a38341b02b4c-kube-api-access-xqf4j\") pod \"barbican-db-create-7zmc2\" (UID: \"267818a6-5bea-4056-b190-a38341b02b4c\") " pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.823953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/267818a6-5bea-4056-b190-a38341b02b4c-operator-scripts\") pod \"barbican-db-create-7zmc2\" (UID: \"267818a6-5bea-4056-b190-a38341b02b4c\") " pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.823988 4795 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.824715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/267818a6-5bea-4056-b190-a38341b02b4c-operator-scripts\") pod \"barbican-db-create-7zmc2\" (UID: \"267818a6-5bea-4056-b190-a38341b02b4c\") " pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.827885 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.844538 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b3d5-account-create-update-cmx25"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.867708 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqf4j\" (UniqueName: \"kubernetes.io/projected/267818a6-5bea-4056-b190-a38341b02b4c-kube-api-access-xqf4j\") pod \"barbican-db-create-7zmc2\" (UID: \"267818a6-5bea-4056-b190-a38341b02b4c\") " pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.888929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" event={"ID":"b6edcee3-faa8-4de5-8a83-d0dd6803844a","Type":"ContainerStarted","Data":"7ce6d361889f8f87d0548b559345046389ab975b8ff2508e7fd13c663e0beff4"} Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.900639 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.926567 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18679a7-6c6c-464c-92f0-893963f6a994-operator-scripts\") pod \"cinder-b3d5-account-create-update-cmx25\" (UID: \"e18679a7-6c6c-464c-92f0-893963f6a994\") " pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.926656 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c781e8-96e9-48d0-b837-7d63996efd39-operator-scripts\") pod \"barbican-a5de-account-create-update-wzl2s\" (UID: \"b6c781e8-96e9-48d0-b837-7d63996efd39\") " pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.927021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzxmz\" (UniqueName: \"kubernetes.io/projected/b6c781e8-96e9-48d0-b837-7d63996efd39-kube-api-access-xzxmz\") pod \"barbican-a5de-account-create-update-wzl2s\" (UID: \"b6c781e8-96e9-48d0-b837-7d63996efd39\") " pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.927069 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwghj\" (UniqueName: \"kubernetes.io/projected/e18679a7-6c6c-464c-92f0-893963f6a994-kube-api-access-gwghj\") pod \"cinder-b3d5-account-create-update-cmx25\" (UID: \"e18679a7-6c6c-464c-92f0-893963f6a994\") " pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.928706 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b6c781e8-96e9-48d0-b837-7d63996efd39-operator-scripts\") pod \"barbican-a5de-account-create-update-wzl2s\" (UID: \"b6c781e8-96e9-48d0-b837-7d63996efd39\") " pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.976777 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzxmz\" (UniqueName: \"kubernetes.io/projected/b6c781e8-96e9-48d0-b837-7d63996efd39-kube-api-access-xzxmz\") pod \"barbican-a5de-account-create-update-wzl2s\" (UID: \"b6c781e8-96e9-48d0-b837-7d63996efd39\") " pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.983434 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7bt8k"] Dec 05 08:43:31 crc kubenswrapper[4795]: I1205 08:43:31.991513 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.013113 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7bt8k"] Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.030572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18679a7-6c6c-464c-92f0-893963f6a994-operator-scripts\") pod \"cinder-b3d5-account-create-update-cmx25\" (UID: \"e18679a7-6c6c-464c-92f0-893963f6a994\") " pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.031070 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwghj\" (UniqueName: \"kubernetes.io/projected/e18679a7-6c6c-464c-92f0-893963f6a994-kube-api-access-gwghj\") pod \"cinder-b3d5-account-create-update-cmx25\" (UID: \"e18679a7-6c6c-464c-92f0-893963f6a994\") " pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 
05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.032463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18679a7-6c6c-464c-92f0-893963f6a994-operator-scripts\") pod \"cinder-b3d5-account-create-update-cmx25\" (UID: \"e18679a7-6c6c-464c-92f0-893963f6a994\") " pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.087691 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-w5h4x"] Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.089323 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.091704 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwghj\" (UniqueName: \"kubernetes.io/projected/e18679a7-6c6c-464c-92f0-893963f6a994-kube-api-access-gwghj\") pod \"cinder-b3d5-account-create-update-cmx25\" (UID: \"e18679a7-6c6c-464c-92f0-893963f6a994\") " pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.099992 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.100387 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.106941 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c75xn" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.107247 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.115742 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w5h4x"] Dec 05 08:43:32 crc 
kubenswrapper[4795]: I1205 08:43:32.132724 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn8dh\" (UniqueName: \"kubernetes.io/projected/47061433-b600-4907-8baf-77eb23065955-kube-api-access-mn8dh\") pod \"neutron-db-create-7bt8k\" (UID: \"47061433-b600-4907-8baf-77eb23065955\") " pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.132832 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47061433-b600-4907-8baf-77eb23065955-operator-scripts\") pod \"neutron-db-create-7bt8k\" (UID: \"47061433-b600-4907-8baf-77eb23065955\") " pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.138953 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9f8c-account-create-update-rc7kk"] Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.147873 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.149497 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.152225 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.176906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9f8c-account-create-update-rc7kk"] Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.236211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47061433-b600-4907-8baf-77eb23065955-operator-scripts\") pod \"neutron-db-create-7bt8k\" (UID: \"47061433-b600-4907-8baf-77eb23065955\") " pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.243145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblj7\" (UniqueName: \"kubernetes.io/projected/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-kube-api-access-nblj7\") pod \"keystone-db-sync-w5h4x\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.243410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-config-data\") pod \"keystone-db-sync-w5h4x\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.245295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn8dh\" (UniqueName: \"kubernetes.io/projected/47061433-b600-4907-8baf-77eb23065955-kube-api-access-mn8dh\") pod \"neutron-db-create-7bt8k\" (UID: \"47061433-b600-4907-8baf-77eb23065955\") " pod="openstack/neutron-db-create-7bt8k" Dec 05 
08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.245460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-combined-ca-bundle\") pod \"keystone-db-sync-w5h4x\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.247103 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47061433-b600-4907-8baf-77eb23065955-operator-scripts\") pod \"neutron-db-create-7bt8k\" (UID: \"47061433-b600-4907-8baf-77eb23065955\") " pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.259310 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.276499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn8dh\" (UniqueName: \"kubernetes.io/projected/47061433-b600-4907-8baf-77eb23065955-kube-api-access-mn8dh\") pod \"neutron-db-create-7bt8k\" (UID: \"47061433-b600-4907-8baf-77eb23065955\") " pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.329730 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.354302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-config-data\") pod \"keystone-db-sync-w5h4x\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.355123 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-884qs\" (UniqueName: \"kubernetes.io/projected/3340f491-333a-4403-8be5-0010ad46ece2-kube-api-access-884qs\") pod \"neutron-9f8c-account-create-update-rc7kk\" (UID: \"3340f491-333a-4403-8be5-0010ad46ece2\") " pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.355167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-combined-ca-bundle\") pod \"keystone-db-sync-w5h4x\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.355260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3340f491-333a-4403-8be5-0010ad46ece2-operator-scripts\") pod \"neutron-9f8c-account-create-update-rc7kk\" (UID: \"3340f491-333a-4403-8be5-0010ad46ece2\") " pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.355312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nblj7\" (UniqueName: \"kubernetes.io/projected/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-kube-api-access-nblj7\") pod \"keystone-db-sync-w5h4x\" 
(UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.359654 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-config-data\") pod \"keystone-db-sync-w5h4x\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.373475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-combined-ca-bundle\") pod \"keystone-db-sync-w5h4x\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.397169 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nblj7\" (UniqueName: \"kubernetes.io/projected/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-kube-api-access-nblj7\") pod \"keystone-db-sync-w5h4x\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.425549 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.466759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-884qs\" (UniqueName: \"kubernetes.io/projected/3340f491-333a-4403-8be5-0010ad46ece2-kube-api-access-884qs\") pod \"neutron-9f8c-account-create-update-rc7kk\" (UID: \"3340f491-333a-4403-8be5-0010ad46ece2\") " pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.466947 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3340f491-333a-4403-8be5-0010ad46ece2-operator-scripts\") pod \"neutron-9f8c-account-create-update-rc7kk\" (UID: \"3340f491-333a-4403-8be5-0010ad46ece2\") " pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.468983 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3340f491-333a-4403-8be5-0010ad46ece2-operator-scripts\") pod \"neutron-9f8c-account-create-update-rc7kk\" (UID: \"3340f491-333a-4403-8be5-0010ad46ece2\") " pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.506742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-884qs\" (UniqueName: \"kubernetes.io/projected/3340f491-333a-4403-8be5-0010ad46ece2-kube-api-access-884qs\") pod \"neutron-9f8c-account-create-update-rc7kk\" (UID: \"3340f491-333a-4403-8be5-0010ad46ece2\") " pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.736395 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2mxzn"] Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.777114 4795 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.809047 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12d7617-6ee5-4227-986e-24bc7d59ab13" path="/var/lib/kubelet/pods/b12d7617-6ee5-4227-986e-24bc7d59ab13/volumes" Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.810833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-7zmc2"] Dec 05 08:43:32 crc kubenswrapper[4795]: I1205 08:43:32.880030 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b3d5-account-create-update-cmx25"] Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.061138 4795 generic.go:334] "Generic (PLEG): container finished" podID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerID="2bacd66b57a0eb329290b8d086ff0982116af038de1537aad1db866f36a93dc1" exitCode=0 Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.061246 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" event={"ID":"b6edcee3-faa8-4de5-8a83-d0dd6803844a","Type":"ContainerDied","Data":"2bacd66b57a0eb329290b8d086ff0982116af038de1537aad1db866f36a93dc1"} Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.075432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7zmc2" event={"ID":"267818a6-5bea-4056-b190-a38341b02b4c","Type":"ContainerStarted","Data":"fe2aba9d226a61a2122f91379d935f784b758b246d770c0589689309bd858c15"} Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.082496 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2mxzn" event={"ID":"1e2b20c8-28b6-4f45-825b-0452d27fa54a","Type":"ContainerStarted","Data":"47e6b4ad81484f0f2b28374f30fb63be6fa54578fd8d89c49e1fbf72ab80a7e0"} Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.127348 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-b3d5-account-create-update-cmx25" event={"ID":"e18679a7-6c6c-464c-92f0-893963f6a994","Type":"ContainerStarted","Data":"f7914e05f5f88be87b675053329fa618afabadfa7fb87b83892a88080537fd3f"} Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.393545 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a5de-account-create-update-wzl2s"] Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.411414 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-w5h4x"] Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.639876 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7bt8k"] Dec 05 08:43:33 crc kubenswrapper[4795]: W1205 08:43:33.679971 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47061433_b600_4907_8baf_77eb23065955.slice/crio-d2b8264b3a8841640f4d697a118ab83e1dbfb2feaabda4af631922a627ac41ec WatchSource:0}: Error finding container d2b8264b3a8841640f4d697a118ab83e1dbfb2feaabda4af631922a627ac41ec: Status 404 returned error can't find the container with id d2b8264b3a8841640f4d697a118ab83e1dbfb2feaabda4af631922a627ac41ec Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.693888 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out" Dec 05 08:43:33 crc kubenswrapper[4795]: I1205 08:43:33.773585 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9f8c-account-create-update-rc7kk"] Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.149941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" 
event={"ID":"b6edcee3-faa8-4de5-8a83-d0dd6803844a","Type":"ContainerStarted","Data":"538a6868476e70c73ffd2e7f712d1babf028ebe8555d570de98a39598dc05838"} Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.152918 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.159947 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5de-account-create-update-wzl2s" event={"ID":"b6c781e8-96e9-48d0-b837-7d63996efd39","Type":"ContainerStarted","Data":"f0c25b1d105b59ba6a5dd6779b27ecfcd56bc24f7e6bc3cec55ea1b2ed63ffb4"} Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.160823 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9f8c-account-create-update-rc7kk" event={"ID":"3340f491-333a-4403-8be5-0010ad46ece2","Type":"ContainerStarted","Data":"81b0be054fdea158dfc717c57a1985138560b5b578dc8403b2fdffd91d210e4f"} Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.164757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7zmc2" event={"ID":"267818a6-5bea-4056-b190-a38341b02b4c","Type":"ContainerStarted","Data":"a840631bdd0a34a1e1595c6a48fc21300a16140e8d0dabb6cde42a48126b8bb8"} Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.172897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2mxzn" event={"ID":"1e2b20c8-28b6-4f45-825b-0452d27fa54a","Type":"ContainerStarted","Data":"fd5427a200add60d5a335aa938c2d568a67b57e25bfc600f2cb942814225659f"} Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.179820 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" podStartSLOduration=5.179789013 podStartE2EDuration="5.179789013s" podCreationTimestamp="2025-12-05 08:43:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-05 08:43:34.178435087 +0000 UTC m=+1165.751038826" watchObservedRunningTime="2025-12-05 08:43:34.179789013 +0000 UTC m=+1165.752392752" Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.192765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5h4x" event={"ID":"3ff5addc-f5af-4fac-b0ff-d5f34d238e69","Type":"ContainerStarted","Data":"ac6368d37d97d9b097682ebbe5e3ad3cd0e2974d3daa0e6834d3cf2717cc1621"} Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.225063 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b3d5-account-create-update-cmx25" event={"ID":"e18679a7-6c6c-464c-92f0-893963f6a994","Type":"ContainerStarted","Data":"6b44e81296b76a731cca8a672c14c4685f5d418e47ecd16251cbe2ac0016fc53"} Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.238065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7bt8k" event={"ID":"47061433-b600-4907-8baf-77eb23065955","Type":"ContainerStarted","Data":"d2b8264b3a8841640f4d697a118ab83e1dbfb2feaabda4af631922a627ac41ec"} Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.250857 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-2mxzn" podStartSLOduration=3.250827229 podStartE2EDuration="3.250827229s" podCreationTimestamp="2025-12-05 08:43:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:34.238138069 +0000 UTC m=+1165.810741808" watchObservedRunningTime="2025-12-05 08:43:34.250827229 +0000 UTC m=+1165.823430968" Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.261756 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-7zmc2" podStartSLOduration=3.261734702 podStartE2EDuration="3.261734702s" podCreationTimestamp="2025-12-05 08:43:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:34.218562524 +0000 UTC m=+1165.791166263" watchObservedRunningTime="2025-12-05 08:43:34.261734702 +0000 UTC m=+1165.834338441" Dec 05 08:43:34 crc kubenswrapper[4795]: I1205 08:43:34.274539 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-b3d5-account-create-update-cmx25" podStartSLOduration=3.274519315 podStartE2EDuration="3.274519315s" podCreationTimestamp="2025-12-05 08:43:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:34.265885883 +0000 UTC m=+1165.838489622" watchObservedRunningTime="2025-12-05 08:43:34.274519315 +0000 UTC m=+1165.847123054" Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.252415 4795 generic.go:334] "Generic (PLEG): container finished" podID="e18679a7-6c6c-464c-92f0-893963f6a994" containerID="6b44e81296b76a731cca8a672c14c4685f5d418e47ecd16251cbe2ac0016fc53" exitCode=0 Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.252768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b3d5-account-create-update-cmx25" event={"ID":"e18679a7-6c6c-464c-92f0-893963f6a994","Type":"ContainerDied","Data":"6b44e81296b76a731cca8a672c14c4685f5d418e47ecd16251cbe2ac0016fc53"} Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.257730 4795 generic.go:334] "Generic (PLEG): container finished" podID="47061433-b600-4907-8baf-77eb23065955" containerID="507426afa98f2c41056331e786e80503871054ca1ddc6d4a8a63a011306490dd" exitCode=0 Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.257823 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7bt8k" event={"ID":"47061433-b600-4907-8baf-77eb23065955","Type":"ContainerDied","Data":"507426afa98f2c41056331e786e80503871054ca1ddc6d4a8a63a011306490dd"} Dec 05 
08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.260246 4795 generic.go:334] "Generic (PLEG): container finished" podID="b6c781e8-96e9-48d0-b837-7d63996efd39" containerID="e1b168a69727606f44f05e214b8152cd080062dcf89e7dd9552572295767d625" exitCode=0 Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.260349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5de-account-create-update-wzl2s" event={"ID":"b6c781e8-96e9-48d0-b837-7d63996efd39","Type":"ContainerDied","Data":"e1b168a69727606f44f05e214b8152cd080062dcf89e7dd9552572295767d625"} Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.262471 4795 generic.go:334] "Generic (PLEG): container finished" podID="3340f491-333a-4403-8be5-0010ad46ece2" containerID="1ffd14242c1ab51541bd6266a5d2a2dcee44dcb3a851f7ee71b7490b377f1cd3" exitCode=0 Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.262531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9f8c-account-create-update-rc7kk" event={"ID":"3340f491-333a-4403-8be5-0010ad46ece2","Type":"ContainerDied","Data":"1ffd14242c1ab51541bd6266a5d2a2dcee44dcb3a851f7ee71b7490b377f1cd3"} Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.265537 4795 generic.go:334] "Generic (PLEG): container finished" podID="267818a6-5bea-4056-b190-a38341b02b4c" containerID="a840631bdd0a34a1e1595c6a48fc21300a16140e8d0dabb6cde42a48126b8bb8" exitCode=0 Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.265636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7zmc2" event={"ID":"267818a6-5bea-4056-b190-a38341b02b4c","Type":"ContainerDied","Data":"a840631bdd0a34a1e1595c6a48fc21300a16140e8d0dabb6cde42a48126b8bb8"} Dec 05 08:43:35 crc kubenswrapper[4795]: I1205 08:43:35.278438 4795 generic.go:334] "Generic (PLEG): container finished" podID="1e2b20c8-28b6-4f45-825b-0452d27fa54a" containerID="fd5427a200add60d5a335aa938c2d568a67b57e25bfc600f2cb942814225659f" exitCode=0 Dec 05 08:43:35 crc 
kubenswrapper[4795]: I1205 08:43:35.278511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2mxzn" event={"ID":"1e2b20c8-28b6-4f45-825b-0452d27fa54a","Type":"ContainerDied","Data":"fd5427a200add60d5a335aa938c2d568a67b57e25bfc600f2cb942814225659f"} Dec 05 08:43:36 crc kubenswrapper[4795]: E1205 08:43:36.410859 4795 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.222:54752->38.102.83.222:40791: write tcp 38.102.83.222:54752->38.102.83.222:40791: write: broken pipe Dec 05 08:43:39 crc kubenswrapper[4795]: I1205 08:43:39.675047 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:43:39 crc kubenswrapper[4795]: I1205 08:43:39.767367 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jpnk4"] Dec 05 08:43:39 crc kubenswrapper[4795]: I1205 08:43:39.767753 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" podUID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" containerName="dnsmasq-dns" containerID="cri-o://a9658456120763d7c91d99b658a97fd6695c0e422123fc69bcbfef0bc36839c5" gracePeriod=10 Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.150350 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.156756 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.190271 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.243793 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3340f491-333a-4403-8be5-0010ad46ece2-operator-scripts\") pod \"3340f491-333a-4403-8be5-0010ad46ece2\" (UID: \"3340f491-333a-4403-8be5-0010ad46ece2\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.253559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-884qs\" (UniqueName: \"kubernetes.io/projected/3340f491-333a-4403-8be5-0010ad46ece2-kube-api-access-884qs\") pod \"3340f491-333a-4403-8be5-0010ad46ece2\" (UID: \"3340f491-333a-4403-8be5-0010ad46ece2\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.253732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzxmz\" (UniqueName: \"kubernetes.io/projected/b6c781e8-96e9-48d0-b837-7d63996efd39-kube-api-access-xzxmz\") pod \"b6c781e8-96e9-48d0-b837-7d63996efd39\" (UID: \"b6c781e8-96e9-48d0-b837-7d63996efd39\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.253958 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c781e8-96e9-48d0-b837-7d63996efd39-operator-scripts\") pod \"b6c781e8-96e9-48d0-b837-7d63996efd39\" (UID: \"b6c781e8-96e9-48d0-b837-7d63996efd39\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.251572 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3340f491-333a-4403-8be5-0010ad46ece2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3340f491-333a-4403-8be5-0010ad46ece2" (UID: "3340f491-333a-4403-8be5-0010ad46ece2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.251889 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.254829 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c781e8-96e9-48d0-b837-7d63996efd39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6c781e8-96e9-48d0-b837-7d63996efd39" (UID: "b6c781e8-96e9-48d0-b837-7d63996efd39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.255443 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6c781e8-96e9-48d0-b837-7d63996efd39-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.255477 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3340f491-333a-4403-8be5-0010ad46ece2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.258335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3340f491-333a-4403-8be5-0010ad46ece2-kube-api-access-884qs" (OuterVolumeSpecName: "kube-api-access-884qs") pod "3340f491-333a-4403-8be5-0010ad46ece2" (UID: "3340f491-333a-4403-8be5-0010ad46ece2"). InnerVolumeSpecName "kube-api-access-884qs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.265445 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c781e8-96e9-48d0-b837-7d63996efd39-kube-api-access-xzxmz" (OuterVolumeSpecName: "kube-api-access-xzxmz") pod "b6c781e8-96e9-48d0-b837-7d63996efd39" (UID: "b6c781e8-96e9-48d0-b837-7d63996efd39"). InnerVolumeSpecName "kube-api-access-xzxmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.350048 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-7zmc2" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.350068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-7zmc2" event={"ID":"267818a6-5bea-4056-b190-a38341b02b4c","Type":"ContainerDied","Data":"fe2aba9d226a61a2122f91379d935f784b758b246d770c0589689309bd858c15"} Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.350137 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe2aba9d226a61a2122f91379d935f784b758b246d770c0589689309bd858c15" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.353587 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.354245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2mxzn" event={"ID":"1e2b20c8-28b6-4f45-825b-0452d27fa54a","Type":"ContainerDied","Data":"47e6b4ad81484f0f2b28374f30fb63be6fa54578fd8d89c49e1fbf72ab80a7e0"} Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.354284 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47e6b4ad81484f0f2b28374f30fb63be6fa54578fd8d89c49e1fbf72ab80a7e0" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.356531 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47061433-b600-4907-8baf-77eb23065955-operator-scripts\") pod \"47061433-b600-4907-8baf-77eb23065955\" (UID: \"47061433-b600-4907-8baf-77eb23065955\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.356739 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqf4j\" (UniqueName: \"kubernetes.io/projected/267818a6-5bea-4056-b190-a38341b02b4c-kube-api-access-xqf4j\") pod \"267818a6-5bea-4056-b190-a38341b02b4c\" (UID: \"267818a6-5bea-4056-b190-a38341b02b4c\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.356806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/267818a6-5bea-4056-b190-a38341b02b4c-operator-scripts\") pod \"267818a6-5bea-4056-b190-a38341b02b4c\" (UID: \"267818a6-5bea-4056-b190-a38341b02b4c\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.356965 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn8dh\" (UniqueName: \"kubernetes.io/projected/47061433-b600-4907-8baf-77eb23065955-kube-api-access-mn8dh\") pod 
\"47061433-b600-4907-8baf-77eb23065955\" (UID: \"47061433-b600-4907-8baf-77eb23065955\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.357470 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-884qs\" (UniqueName: \"kubernetes.io/projected/3340f491-333a-4403-8be5-0010ad46ece2-kube-api-access-884qs\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.357489 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzxmz\" (UniqueName: \"kubernetes.io/projected/b6c781e8-96e9-48d0-b837-7d63996efd39-kube-api-access-xzxmz\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.358171 4795 generic.go:334] "Generic (PLEG): container finished" podID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" containerID="a9658456120763d7c91d99b658a97fd6695c0e422123fc69bcbfef0bc36839c5" exitCode=0 Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.358230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" event={"ID":"fa752e5e-08d3-49db-b564-8efe2d39b1ca","Type":"ContainerDied","Data":"a9658456120763d7c91d99b658a97fd6695c0e422123fc69bcbfef0bc36839c5"} Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.359278 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47061433-b600-4907-8baf-77eb23065955-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47061433-b600-4907-8baf-77eb23065955" (UID: "47061433-b600-4907-8baf-77eb23065955"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.359720 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/267818a6-5bea-4056-b190-a38341b02b4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "267818a6-5bea-4056-b190-a38341b02b4c" (UID: "267818a6-5bea-4056-b190-a38341b02b4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.365568 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267818a6-5bea-4056-b190-a38341b02b4c-kube-api-access-xqf4j" (OuterVolumeSpecName: "kube-api-access-xqf4j") pod "267818a6-5bea-4056-b190-a38341b02b4c" (UID: "267818a6-5bea-4056-b190-a38341b02b4c"). InnerVolumeSpecName "kube-api-access-xqf4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.365865 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b3d5-account-create-update-cmx25" event={"ID":"e18679a7-6c6c-464c-92f0-893963f6a994","Type":"ContainerDied","Data":"f7914e05f5f88be87b675053329fa618afabadfa7fb87b83892a88080537fd3f"} Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.365899 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7914e05f5f88be87b675053329fa618afabadfa7fb87b83892a88080537fd3f" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.365960 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b3d5-account-create-update-cmx25" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.383765 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47061433-b600-4907-8baf-77eb23065955-kube-api-access-mn8dh" (OuterVolumeSpecName: "kube-api-access-mn8dh") pod "47061433-b600-4907-8baf-77eb23065955" (UID: "47061433-b600-4907-8baf-77eb23065955"). InnerVolumeSpecName "kube-api-access-mn8dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.388923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7bt8k" event={"ID":"47061433-b600-4907-8baf-77eb23065955","Type":"ContainerDied","Data":"d2b8264b3a8841640f4d697a118ab83e1dbfb2feaabda4af631922a627ac41ec"} Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.388972 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2b8264b3a8841640f4d697a118ab83e1dbfb2feaabda4af631922a627ac41ec" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.389038 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7bt8k" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.431935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a5de-account-create-update-wzl2s" event={"ID":"b6c781e8-96e9-48d0-b837-7d63996efd39","Type":"ContainerDied","Data":"f0c25b1d105b59ba6a5dd6779b27ecfcd56bc24f7e6bc3cec55ea1b2ed63ffb4"} Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.431991 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0c25b1d105b59ba6a5dd6779b27ecfcd56bc24f7e6bc3cec55ea1b2ed63ffb4" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.432095 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a5de-account-create-update-wzl2s" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.433347 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.436134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9f8c-account-create-update-rc7kk" event={"ID":"3340f491-333a-4403-8be5-0010ad46ece2","Type":"ContainerDied","Data":"81b0be054fdea158dfc717c57a1985138560b5b578dc8403b2fdffd91d210e4f"} Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.436164 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b0be054fdea158dfc717c57a1985138560b5b578dc8403b2fdffd91d210e4f" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.436217 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9f8c-account-create-update-rc7kk" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.459395 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwghj\" (UniqueName: \"kubernetes.io/projected/e18679a7-6c6c-464c-92f0-893963f6a994-kube-api-access-gwghj\") pod \"e18679a7-6c6c-464c-92f0-893963f6a994\" (UID: \"e18679a7-6c6c-464c-92f0-893963f6a994\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.459658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18679a7-6c6c-464c-92f0-893963f6a994-operator-scripts\") pod \"e18679a7-6c6c-464c-92f0-893963f6a994\" (UID: \"e18679a7-6c6c-464c-92f0-893963f6a994\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.461121 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e18679a7-6c6c-464c-92f0-893963f6a994-operator-scripts" (OuterVolumeSpecName: "operator-scripts") 
pod "e18679a7-6c6c-464c-92f0-893963f6a994" (UID: "e18679a7-6c6c-464c-92f0-893963f6a994"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.465223 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn8dh\" (UniqueName: \"kubernetes.io/projected/47061433-b600-4907-8baf-77eb23065955-kube-api-access-mn8dh\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.465266 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47061433-b600-4907-8baf-77eb23065955-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.465282 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqf4j\" (UniqueName: \"kubernetes.io/projected/267818a6-5bea-4056-b190-a38341b02b4c-kube-api-access-xqf4j\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.465293 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e18679a7-6c6c-464c-92f0-893963f6a994-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.465307 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/267818a6-5bea-4056-b190-a38341b02b4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.467010 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18679a7-6c6c-464c-92f0-893963f6a994-kube-api-access-gwghj" (OuterVolumeSpecName: "kube-api-access-gwghj") pod "e18679a7-6c6c-464c-92f0-893963f6a994" (UID: "e18679a7-6c6c-464c-92f0-893963f6a994"). InnerVolumeSpecName "kube-api-access-gwghj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.520905 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.567960 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2b20c8-28b6-4f45-825b-0452d27fa54a-operator-scripts\") pod \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\" (UID: \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.568476 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e2b20c8-28b6-4f45-825b-0452d27fa54a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e2b20c8-28b6-4f45-825b-0452d27fa54a" (UID: "1e2b20c8-28b6-4f45-825b-0452d27fa54a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.568739 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggvqb\" (UniqueName: \"kubernetes.io/projected/1e2b20c8-28b6-4f45-825b-0452d27fa54a-kube-api-access-ggvqb\") pod \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\" (UID: \"1e2b20c8-28b6-4f45-825b-0452d27fa54a\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.572710 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e2b20c8-28b6-4f45-825b-0452d27fa54a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.572748 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwghj\" (UniqueName: \"kubernetes.io/projected/e18679a7-6c6c-464c-92f0-893963f6a994-kube-api-access-gwghj\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc 
kubenswrapper[4795]: I1205 08:43:40.577944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e2b20c8-28b6-4f45-825b-0452d27fa54a-kube-api-access-ggvqb" (OuterVolumeSpecName: "kube-api-access-ggvqb") pod "1e2b20c8-28b6-4f45-825b-0452d27fa54a" (UID: "1e2b20c8-28b6-4f45-825b-0452d27fa54a"). InnerVolumeSpecName "kube-api-access-ggvqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.673660 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-sb\") pod \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.673760 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-dns-svc\") pod \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.673848 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb6bq\" (UniqueName: \"kubernetes.io/projected/fa752e5e-08d3-49db-b564-8efe2d39b1ca-kube-api-access-lb6bq\") pod \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.673892 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-nb\") pod \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.674033 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-config\") pod \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\" (UID: \"fa752e5e-08d3-49db-b564-8efe2d39b1ca\") " Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.674393 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggvqb\" (UniqueName: \"kubernetes.io/projected/1e2b20c8-28b6-4f45-825b-0452d27fa54a-kube-api-access-ggvqb\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.686545 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa752e5e-08d3-49db-b564-8efe2d39b1ca-kube-api-access-lb6bq" (OuterVolumeSpecName: "kube-api-access-lb6bq") pod "fa752e5e-08d3-49db-b564-8efe2d39b1ca" (UID: "fa752e5e-08d3-49db-b564-8efe2d39b1ca"). InnerVolumeSpecName "kube-api-access-lb6bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.726116 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa752e5e-08d3-49db-b564-8efe2d39b1ca" (UID: "fa752e5e-08d3-49db-b564-8efe2d39b1ca"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.727961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa752e5e-08d3-49db-b564-8efe2d39b1ca" (UID: "fa752e5e-08d3-49db-b564-8efe2d39b1ca"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.730672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa752e5e-08d3-49db-b564-8efe2d39b1ca" (UID: "fa752e5e-08d3-49db-b564-8efe2d39b1ca"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.748465 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-config" (OuterVolumeSpecName: "config") pod "fa752e5e-08d3-49db-b564-8efe2d39b1ca" (UID: "fa752e5e-08d3-49db-b564-8efe2d39b1ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.775828 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.775866 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.775881 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 08:43:40.775892 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb6bq\" (UniqueName: \"kubernetes.io/projected/fa752e5e-08d3-49db-b564-8efe2d39b1ca-kube-api-access-lb6bq\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:40 crc kubenswrapper[4795]: I1205 
08:43:40.775901 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa752e5e-08d3-49db-b564-8efe2d39b1ca-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.447295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5h4x" event={"ID":"3ff5addc-f5af-4fac-b0ff-d5f34d238e69","Type":"ContainerStarted","Data":"894be04402414d23cd3d7ef4faab946243b79939d8d1ebb8f7875e2424a4b193"} Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.449492 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2mxzn" Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.449500 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" event={"ID":"fa752e5e-08d3-49db-b564-8efe2d39b1ca","Type":"ContainerDied","Data":"85514fb445dc69d5b7419133eac3da44fd2d15eec44d8dd97dd714ba758479f6"} Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.449691 4795 scope.go:117] "RemoveContainer" containerID="a9658456120763d7c91d99b658a97fd6695c0e422123fc69bcbfef0bc36839c5" Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.449526 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-jpnk4" Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.481329 4795 scope.go:117] "RemoveContainer" containerID="d9d2ec164824abd94a54796a572c3d7786ff94286bd52cd8b9aab2d31447ff8b" Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.485562 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-w5h4x" podStartSLOduration=3.794476184 podStartE2EDuration="10.485534965s" podCreationTimestamp="2025-12-05 08:43:31 +0000 UTC" firstStartedPulling="2025-12-05 08:43:33.50689716 +0000 UTC m=+1165.079500899" lastFinishedPulling="2025-12-05 08:43:40.197955941 +0000 UTC m=+1171.770559680" observedRunningTime="2025-12-05 08:43:41.483776657 +0000 UTC m=+1173.056380396" watchObservedRunningTime="2025-12-05 08:43:41.485534965 +0000 UTC m=+1173.058138694" Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.517361 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jpnk4"] Dec 05 08:43:41 crc kubenswrapper[4795]: I1205 08:43:41.541123 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-jpnk4"] Dec 05 08:43:42 crc kubenswrapper[4795]: I1205 08:43:42.758030 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" path="/var/lib/kubelet/pods/fa752e5e-08d3-49db-b564-8efe2d39b1ca/volumes" Dec 05 08:43:45 crc kubenswrapper[4795]: I1205 08:43:45.490780 4795 generic.go:334] "Generic (PLEG): container finished" podID="3ff5addc-f5af-4fac-b0ff-d5f34d238e69" containerID="894be04402414d23cd3d7ef4faab946243b79939d8d1ebb8f7875e2424a4b193" exitCode=0 Dec 05 08:43:45 crc kubenswrapper[4795]: I1205 08:43:45.490979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5h4x" 
event={"ID":"3ff5addc-f5af-4fac-b0ff-d5f34d238e69","Type":"ContainerDied","Data":"894be04402414d23cd3d7ef4faab946243b79939d8d1ebb8f7875e2424a4b193"} Dec 05 08:43:46 crc kubenswrapper[4795]: I1205 08:43:46.888886 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:46 crc kubenswrapper[4795]: I1205 08:43:46.928054 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nblj7\" (UniqueName: \"kubernetes.io/projected/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-kube-api-access-nblj7\") pod \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " Dec 05 08:43:46 crc kubenswrapper[4795]: I1205 08:43:46.928170 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-config-data\") pod \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " Dec 05 08:43:46 crc kubenswrapper[4795]: I1205 08:43:46.928408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-combined-ca-bundle\") pod \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\" (UID: \"3ff5addc-f5af-4fac-b0ff-d5f34d238e69\") " Dec 05 08:43:46 crc kubenswrapper[4795]: I1205 08:43:46.937991 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-kube-api-access-nblj7" (OuterVolumeSpecName: "kube-api-access-nblj7") pod "3ff5addc-f5af-4fac-b0ff-d5f34d238e69" (UID: "3ff5addc-f5af-4fac-b0ff-d5f34d238e69"). InnerVolumeSpecName "kube-api-access-nblj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:46 crc kubenswrapper[4795]: I1205 08:43:46.968412 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ff5addc-f5af-4fac-b0ff-d5f34d238e69" (UID: "3ff5addc-f5af-4fac-b0ff-d5f34d238e69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.004035 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-config-data" (OuterVolumeSpecName: "config-data") pod "3ff5addc-f5af-4fac-b0ff-d5f34d238e69" (UID: "3ff5addc-f5af-4fac-b0ff-d5f34d238e69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.030862 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.030917 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.030933 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nblj7\" (UniqueName: \"kubernetes.io/projected/3ff5addc-f5af-4fac-b0ff-d5f34d238e69-kube-api-access-nblj7\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.516129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-w5h4x" 
event={"ID":"3ff5addc-f5af-4fac-b0ff-d5f34d238e69","Type":"ContainerDied","Data":"ac6368d37d97d9b097682ebbe5e3ad3cd0e2974d3daa0e6834d3cf2717cc1621"} Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.516443 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac6368d37d97d9b097682ebbe5e3ad3cd0e2974d3daa0e6834d3cf2717cc1621" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.516252 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-w5h4x" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.857163 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vgkrx"] Dec 05 08:43:47 crc kubenswrapper[4795]: E1205 08:43:47.858058 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff5addc-f5af-4fac-b0ff-d5f34d238e69" containerName="keystone-db-sync" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858096 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff5addc-f5af-4fac-b0ff-d5f34d238e69" containerName="keystone-db-sync" Dec 05 08:43:47 crc kubenswrapper[4795]: E1205 08:43:47.858117 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267818a6-5bea-4056-b190-a38341b02b4c" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858124 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="267818a6-5bea-4056-b190-a38341b02b4c" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: E1205 08:43:47.858150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47061433-b600-4907-8baf-77eb23065955" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858175 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="47061433-b600-4907-8baf-77eb23065955" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: E1205 08:43:47.858190 
4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" containerName="init" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858197 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" containerName="init" Dec 05 08:43:47 crc kubenswrapper[4795]: E1205 08:43:47.858214 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e2b20c8-28b6-4f45-825b-0452d27fa54a" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858221 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e2b20c8-28b6-4f45-825b-0452d27fa54a" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: E1205 08:43:47.858252 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" containerName="dnsmasq-dns" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858259 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" containerName="dnsmasq-dns" Dec 05 08:43:47 crc kubenswrapper[4795]: E1205 08:43:47.858272 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3340f491-333a-4403-8be5-0010ad46ece2" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858278 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3340f491-333a-4403-8be5-0010ad46ece2" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc kubenswrapper[4795]: E1205 08:43:47.858286 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18679a7-6c6c-464c-92f0-893963f6a994" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858295 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18679a7-6c6c-464c-92f0-893963f6a994" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc 
kubenswrapper[4795]: E1205 08:43:47.858306 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c781e8-96e9-48d0-b837-7d63996efd39" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858328 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c781e8-96e9-48d0-b837-7d63996efd39" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858536 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff5addc-f5af-4fac-b0ff-d5f34d238e69" containerName="keystone-db-sync" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858575 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3340f491-333a-4403-8be5-0010ad46ece2" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858590 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c781e8-96e9-48d0-b837-7d63996efd39" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858676 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18679a7-6c6c-464c-92f0-893963f6a994" containerName="mariadb-account-create-update" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858698 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa752e5e-08d3-49db-b564-8efe2d39b1ca" containerName="dnsmasq-dns" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858712 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e2b20c8-28b6-4f45-825b-0452d27fa54a" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858722 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="47061433-b600-4907-8baf-77eb23065955" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.858730 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="267818a6-5bea-4056-b190-a38341b02b4c" containerName="mariadb-database-create" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.860038 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.870925 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vgkrx"] Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.939300 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rsp78"] Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.940733 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.965587 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.965804 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.965963 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.966045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.966081 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.966086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmhr9\" 
(UniqueName: \"kubernetes.io/projected/ce748b74-fe9d-4534-8c66-f34c5883c93b-kube-api-access-kmhr9\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.966126 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-config\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.966154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.966194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.966241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.966243 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"keystone-keystone-dockercfg-c75xn" Dec 05 08:43:47 crc kubenswrapper[4795]: I1205 08:43:47.997584 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rsp78"] Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.068664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.068825 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmhr9\" (UniqueName: \"kubernetes.io/projected/ce748b74-fe9d-4534-8c66-f34c5883c93b-kube-api-access-kmhr9\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.068969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcvzt\" (UniqueName: \"kubernetes.io/projected/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-kube-api-access-bcvzt\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.068993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-config\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.069124 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-credential-keys\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.069145 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.069171 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-config-data\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.069305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-scripts\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.069441 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.069470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-combined-ca-bundle\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.069601 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-fernet-keys\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.069657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.072490 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.074545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.075419 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-config\") pod 
\"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.076997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.078069 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.098959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmhr9\" (UniqueName: \"kubernetes.io/projected/ce748b74-fe9d-4534-8c66-f34c5883c93b-kube-api-access-kmhr9\") pod \"dnsmasq-dns-bbf5cc879-vgkrx\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.172273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcvzt\" (UniqueName: \"kubernetes.io/projected/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-kube-api-access-bcvzt\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.172812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-credential-keys\") pod \"keystone-bootstrap-rsp78\" (UID: 
\"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.172864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-config-data\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.172888 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-scripts\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.172943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-combined-ca-bundle\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.172972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-fernet-keys\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.189648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-scripts\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.190012 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-fernet-keys\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.194595 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-credential-keys\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.202859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-combined-ca-bundle\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.204389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-config-data\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.223957 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.243584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcvzt\" (UniqueName: \"kubernetes.io/projected/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-kube-api-access-bcvzt\") pod \"keystone-bootstrap-rsp78\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.308480 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.312949 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-649bd54f-sw6jd"] Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.318830 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.327745 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-d5xmd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.328911 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.329136 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.329406 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.375358 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-649bd54f-sw6jd"] Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.383118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ffa70a61-cdc5-40fa-a9bd-338a244659c4-logs\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.383184 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v452\" (UniqueName: \"kubernetes.io/projected/ffa70a61-cdc5-40fa-a9bd-338a244659c4-kube-api-access-5v452\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.383214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-scripts\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.383241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-config-data\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.383279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ffa70a61-cdc5-40fa-a9bd-338a244659c4-horizon-secret-key\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.434083 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-cq4gw"] Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.435409 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.454524 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hrknz" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.454786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.454985 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.460836 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-bxt72"] Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.462131 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.515972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v452\" (UniqueName: \"kubernetes.io/projected/ffa70a61-cdc5-40fa-a9bd-338a244659c4-kube-api-access-5v452\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.516421 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-scripts\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.516856 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-config-data\") pod \"horizon-649bd54f-sw6jd\" (UID: 
\"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.519172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ffa70a61-cdc5-40fa-a9bd-338a244659c4-horizon-secret-key\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.519505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffa70a61-cdc5-40fa-a9bd-338a244659c4-logs\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.521759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.523497 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.526019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-config-data\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.529261 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rls2n" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.519017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-scripts\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") 
" pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.536461 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffa70a61-cdc5-40fa-a9bd-338a244659c4-logs\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.626305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-scripts\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.630384 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-combined-ca-bundle\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.630664 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p248j\" (UniqueName: \"kubernetes.io/projected/7de95e6b-d594-4ed4-8b8d-041346856347-kube-api-access-p248j\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.630756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-db-sync-config-data\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc 
kubenswrapper[4795]: I1205 08:43:48.630880 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd6ce9d5-263a-4b05-83e5-c349f0038001-etc-machine-id\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.630919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-config-data\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.631237 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-config\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.631321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-combined-ca-bundle\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.631569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxztf\" (UniqueName: \"kubernetes.io/projected/dd6ce9d5-263a-4b05-83e5-c349f0038001-kube-api-access-lxztf\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.642509 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v452\" (UniqueName: \"kubernetes.io/projected/ffa70a61-cdc5-40fa-a9bd-338a244659c4-kube-api-access-5v452\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.652066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ffa70a61-cdc5-40fa-a9bd-338a244659c4-horizon-secret-key\") pod \"horizon-649bd54f-sw6jd\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.678758 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p248j\" (UniqueName: \"kubernetes.io/projected/7de95e6b-d594-4ed4-8b8d-041346856347-kube-api-access-p248j\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-db-sync-config-data\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd6ce9d5-263a-4b05-83e5-c349f0038001-etc-machine-id\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " 
pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739548 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-config-data\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-config\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739643 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-combined-ca-bundle\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxztf\" (UniqueName: \"kubernetes.io/projected/dd6ce9d5-263a-4b05-83e5-c349f0038001-kube-api-access-lxztf\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-scripts\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.739755 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-combined-ca-bundle\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.773797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd6ce9d5-263a-4b05-83e5-c349f0038001-etc-machine-id\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.779198 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-db-sync-config-data\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.791816 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-nxkbh"] Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.800750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-scripts\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.806834 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-config\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.807585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-config-data\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.811793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-combined-ca-bundle\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.814982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-combined-ca-bundle\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.868949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxztf\" (UniqueName: \"kubernetes.io/projected/dd6ce9d5-263a-4b05-83e5-c349f0038001-kube-api-access-lxztf\") pod \"cinder-db-sync-cq4gw\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.938352 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.947717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-logs\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.947793 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-config-data\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.947871 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-scripts\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.947899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-combined-ca-bundle\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.948118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvd7w\" (UniqueName: \"kubernetes.io/projected/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-kube-api-access-gvd7w\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " 
pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.950387 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p248j\" (UniqueName: \"kubernetes.io/projected/7de95e6b-d594-4ed4-8b8d-041346856347-kube-api-access-p248j\") pod \"neutron-db-sync-bxt72\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.954001 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bxt72" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.963310 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.965798 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 08:43:48 crc kubenswrapper[4795]: I1205 08:43:48.966018 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rkxnk" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.017363 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bxt72"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.042918 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-k28kg"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.044215 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.048201 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.048296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-czt5p" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.049786 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-logs\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.049821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-config-data\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.049865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-scripts\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.049886 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-combined-ca-bundle\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.049974 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gvd7w\" (UniqueName: \"kubernetes.io/projected/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-kube-api-access-gvd7w\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.050645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-logs\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.064483 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.075189 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-config-data\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.078476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-combined-ca-bundle\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.082990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-scripts\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.092263 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvd7w\" (UniqueName: \"kubernetes.io/projected/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-kube-api-access-gvd7w\") pod \"placement-db-sync-nxkbh\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.126434 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cq4gw"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.137684 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k28kg"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.155546 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-db-sync-config-data\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.155678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6lbn\" (UniqueName: \"kubernetes.io/projected/44d62cd1-585f-4756-b3f9-6f0714ea3248-kube-api-access-j6lbn\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.155718 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-combined-ca-bundle\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.216841 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-nxkbh"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.227781 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-665c6496b5-rcd65"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.229701 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.261758 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-scripts\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.262191 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h8zh\" (UniqueName: \"kubernetes.io/projected/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-kube-api-access-7h8zh\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.262224 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6lbn\" (UniqueName: \"kubernetes.io/projected/44d62cd1-585f-4756-b3f9-6f0714ea3248-kube-api-access-j6lbn\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.262251 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-horizon-secret-key\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc 
kubenswrapper[4795]: I1205 08:43:49.262278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-combined-ca-bundle\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.262328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-logs\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.262354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-config-data\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.262372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-db-sync-config-data\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.310918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-combined-ca-bundle\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.311226 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-db-sync-config-data\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.327582 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6lbn\" (UniqueName: \"kubernetes.io/projected/44d62cd1-585f-4756-b3f9-6f0714ea3248-kube-api-access-j6lbn\") pod \"barbican-db-sync-k28kg\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.329007 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nxkbh" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.329834 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vgkrx"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.365096 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-665c6496b5-rcd65"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.369162 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-scripts\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.369239 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h8zh\" (UniqueName: \"kubernetes.io/projected/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-kube-api-access-7h8zh\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.369286 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-horizon-secret-key\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.369359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-logs\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.369387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-config-data\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.377826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-scripts\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.384002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-logs\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.402184 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k28kg" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.429087 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.439433 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-horizon-secret-key\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.439898 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.450493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-config-data\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.461374 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.465812 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.472053 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.472086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wcj8r\" (UniqueName: \"kubernetes.io/projected/b1f0a01c-4d6c-4534-950a-699df43b935a-kube-api-access-wcj8r\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.472177 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-config-data\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.472235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-log-httpd\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.472260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.472279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-run-httpd\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.472299 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-scripts\") pod \"ceilometer-0\" (UID: 
\"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.507296 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-jtjqv"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.509332 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.512402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h8zh\" (UniqueName: \"kubernetes.io/projected/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-kube-api-access-7h8zh\") pod \"horizon-665c6496b5-rcd65\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.542675 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.579827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-log-httpd\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.580078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.587091 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-log-httpd\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " 
pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-run-httpd\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-scripts\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rqrs\" (UniqueName: \"kubernetes.io/projected/83eff3d1-a4b3-4d67-9131-57df81514ccb-kube-api-access-5rqrs\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589572 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-config\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 
08:43:49.589708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589727 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcj8r\" (UniqueName: \"kubernetes.io/projected/b1f0a01c-4d6c-4534-950a-699df43b935a-kube-api-access-wcj8r\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589820 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.589950 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.590091 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-config-data\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.593384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-run-httpd\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.597552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.603824 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-jtjqv"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.621374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.621824 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-scripts\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.622723 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-config-data\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.626707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcj8r\" (UniqueName: \"kubernetes.io/projected/b1f0a01c-4d6c-4534-950a-699df43b935a-kube-api-access-wcj8r\") pod \"ceilometer-0\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.683123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" event={"ID":"ce748b74-fe9d-4534-8c66-f34c5883c93b","Type":"ContainerStarted","Data":"c7aec798dffb8804c85d54623446e91fe6d90fd6da1144a2b81339cfe719d00b"} Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.691788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rqrs\" (UniqueName: \"kubernetes.io/projected/83eff3d1-a4b3-4d67-9131-57df81514ccb-kube-api-access-5rqrs\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.691840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-config\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.691865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.691905 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.691932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.691962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.693051 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.694175 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc 
kubenswrapper[4795]: I1205 08:43:49.694337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.694784 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.694970 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-config\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.700441 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.704449 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.709262 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.709502 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.709677 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wbtsl" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.709789 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.717801 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.758955 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.763555 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.779897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rqrs\" (UniqueName: \"kubernetes.io/projected/83eff3d1-a4b3-4d67-9131-57df81514ccb-kube-api-access-5rqrs\") pod \"dnsmasq-dns-56df8fb6b7-jtjqv\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.784749 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.785351 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.801986 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.823371 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.824034 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.835039 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vgkrx"] Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.933346 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936360 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-logs\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") 
" pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936600 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wlw6\" (UniqueName: \"kubernetes.io/projected/e6daad2c-c908-49b6-833b-9f762954ac4c-kube-api-access-9wlw6\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936701 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936853 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.936975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvfvw\" (UniqueName: \"kubernetes.io/projected/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-kube-api-access-tvfvw\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.937046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.937092 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.937154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.937185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:49 crc kubenswrapper[4795]: I1205 08:43:49.937220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wlw6\" (UniqueName: \"kubernetes.io/projected/e6daad2c-c908-49b6-833b-9f762954ac4c-kube-api-access-9wlw6\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048405 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048536 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvfvw\" (UniqueName: \"kubernetes.io/projected/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-kube-api-access-tvfvw\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " 
pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.048777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.049861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.049930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.049975 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-logs\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.050024 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.050157 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.050256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.063215 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-logs\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.064757 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.066180 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.066180 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.068026 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.073547 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.086858 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.088710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.089319 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvfvw\" (UniqueName: 
\"kubernetes.io/projected/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-kube-api-access-tvfvw\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.110574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wlw6\" (UniqueName: \"kubernetes.io/projected/e6daad2c-c908-49b6-833b-9f762954ac4c-kube-api-access-9wlw6\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.111023 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.113664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.114241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.116113 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.125423 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rsp78"] Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.165662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.166467 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-649bd54f-sw6jd"] Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.198484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.364701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.392295 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 
08:43:50.393132 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-bxt72"] Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.435505 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.604549 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-cq4gw"] Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.637448 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.891669 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-649bd54f-sw6jd" event={"ID":"ffa70a61-cdc5-40fa-a9bd-338a244659c4","Type":"ContainerStarted","Data":"731411a3a0c93a7ae6e24f36a11e9ef5c338429e82eb8d6c1722cca2db523713"} Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.891991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cq4gw" event={"ID":"dd6ce9d5-263a-4b05-83e5-c349f0038001","Type":"ContainerStarted","Data":"179a1297321689ba17b297789e4615ffb4fa777af5a4d0361e911112e839038a"} Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.920727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsp78" event={"ID":"04bd2c9e-9b28-45df-bbb7-4da844ddec3d","Type":"ContainerStarted","Data":"0712dac66cbe9c5b9f775c63d8428d48b20258e44ee65ef7207cbedb28335518"} Dec 05 08:43:50 crc kubenswrapper[4795]: I1205 08:43:50.924890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bxt72" event={"ID":"7de95e6b-d594-4ed4-8b8d-041346856347","Type":"ContainerStarted","Data":"e825e4fbaedb8e968ed22153cc1de548ce30d2362815f0d54881c381a2052a21"} Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.064919 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-nxkbh"] Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.154669 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-k28kg"] Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.279513 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-665c6496b5-rcd65"] Dec 05 08:43:51 crc kubenswrapper[4795]: W1205 08:43:51.293251 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3d42fb_b0fc_4388_b7fc_17ca1fb7a4ff.slice/crio-b60b7281f4294237813b8573c9fdbfca9fb443e275b0d406005dfca4d8609c7b WatchSource:0}: Error finding container b60b7281f4294237813b8573c9fdbfca9fb443e275b0d406005dfca4d8609c7b: Status 404 returned error can't find the container with id b60b7281f4294237813b8573c9fdbfca9fb443e275b0d406005dfca4d8609c7b Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.595382 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.832872 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-jtjqv"] Dec 05 08:43:51 crc kubenswrapper[4795]: W1205 08:43:51.846220 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83eff3d1_a4b3_4d67_9131_57df81514ccb.slice/crio-8a61dbbc97d61fa0a96e35176ab5f32d28f05f44bf1f540db36f93cf1ef089d8 WatchSource:0}: Error finding container 8a61dbbc97d61fa0a96e35176ab5f32d28f05f44bf1f540db36f93cf1ef089d8: Status 404 returned error can't find the container with id 8a61dbbc97d61fa0a96e35176ab5f32d28f05f44bf1f540db36f93cf1ef089d8 Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.959114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b1f0a01c-4d6c-4534-950a-699df43b935a","Type":"ContainerStarted","Data":"a9ca994b258a47134d9986b79c9412e8194ae0281c14550fa312f7b7a0fac6e6"} Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.964797 4795 generic.go:334] "Generic (PLEG): container finished" podID="ce748b74-fe9d-4534-8c66-f34c5883c93b" containerID="ef875a96414a8fb97aeae987eb5f1bbc23cf406f49362fc122c4508ac283a1be" exitCode=0 Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.964870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" event={"ID":"ce748b74-fe9d-4534-8c66-f34c5883c93b","Type":"ContainerDied","Data":"ef875a96414a8fb97aeae987eb5f1bbc23cf406f49362fc122c4508ac283a1be"} Dec 05 08:43:51 crc kubenswrapper[4795]: I1205 08:43:51.985813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665c6496b5-rcd65" event={"ID":"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff","Type":"ContainerStarted","Data":"b60b7281f4294237813b8573c9fdbfca9fb443e275b0d406005dfca4d8609c7b"} Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.012709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nxkbh" event={"ID":"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87","Type":"ContainerStarted","Data":"2a40f844943afb15b416942f6ae9fad0e63fe182cead92a68aa515b73d6ca0b4"} Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.032331 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.051400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsp78" event={"ID":"04bd2c9e-9b28-45df-bbb7-4da844ddec3d","Type":"ContainerStarted","Data":"a19b2ed86ada4f3fd6d8dc18a552df4ab151f3651f6e0c08cf9d9885ead1f563"} Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.059394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k28kg" 
event={"ID":"44d62cd1-585f-4756-b3f9-6f0714ea3248","Type":"ContainerStarted","Data":"8004cfea5dd019a63db6cf532b45cfde2e4bbac38423edb21273d54b43005f95"} Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.062219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bxt72" event={"ID":"7de95e6b-d594-4ed4-8b8d-041346856347","Type":"ContainerStarted","Data":"21ba3b63122aef1bb5134dde6523b67bf9263ad14895e2960724b5a4d48c62a2"} Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.069024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" event={"ID":"83eff3d1-a4b3-4d67-9131-57df81514ccb","Type":"ContainerStarted","Data":"8a61dbbc97d61fa0a96e35176ab5f32d28f05f44bf1f540db36f93cf1ef089d8"} Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.104770 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rsp78" podStartSLOduration=5.10473755 podStartE2EDuration="5.10473755s" podCreationTimestamp="2025-12-05 08:43:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:52.088997759 +0000 UTC m=+1183.661601498" watchObservedRunningTime="2025-12-05 08:43:52.10473755 +0000 UTC m=+1183.677341289" Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.140002 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-bxt72" podStartSLOduration=4.139975096 podStartE2EDuration="4.139975096s" podCreationTimestamp="2025-12-05 08:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:52.132976089 +0000 UTC m=+1183.705579828" watchObservedRunningTime="2025-12-05 08:43:52.139975096 +0000 UTC m=+1183.712578835" Dec 05 08:43:52 crc kubenswrapper[4795]: I1205 08:43:52.446573 4795 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.011076 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.129141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-nb\") pod \"ce748b74-fe9d-4534-8c66-f34c5883c93b\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.129203 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-swift-storage-0\") pod \"ce748b74-fe9d-4534-8c66-f34c5883c93b\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.129276 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmhr9\" (UniqueName: \"kubernetes.io/projected/ce748b74-fe9d-4534-8c66-f34c5883c93b-kube-api-access-kmhr9\") pod \"ce748b74-fe9d-4534-8c66-f34c5883c93b\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.129311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-config\") pod \"ce748b74-fe9d-4534-8c66-f34c5883c93b\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.129457 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-sb\") pod \"ce748b74-fe9d-4534-8c66-f34c5883c93b\" (UID: 
\"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.129495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-svc\") pod \"ce748b74-fe9d-4534-8c66-f34c5883c93b\" (UID: \"ce748b74-fe9d-4534-8c66-f34c5883c93b\") " Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.161968 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce748b74-fe9d-4534-8c66-f34c5883c93b-kube-api-access-kmhr9" (OuterVolumeSpecName: "kube-api-access-kmhr9") pod "ce748b74-fe9d-4534-8c66-f34c5883c93b" (UID: "ce748b74-fe9d-4534-8c66-f34c5883c93b"). InnerVolumeSpecName "kube-api-access-kmhr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.213011 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce748b74-fe9d-4534-8c66-f34c5883c93b" (UID: "ce748b74-fe9d-4534-8c66-f34c5883c93b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.216647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6daad2c-c908-49b6-833b-9f762954ac4c","Type":"ContainerStarted","Data":"0b4a83070fd7a5be185b4760695daa06b2802d3f4522e49c8a03e14486d1f31b"} Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.232293 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmhr9\" (UniqueName: \"kubernetes.io/projected/ce748b74-fe9d-4534-8c66-f34c5883c93b-kube-api-access-kmhr9\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.232330 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.247240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" event={"ID":"ce748b74-fe9d-4534-8c66-f34c5883c93b","Type":"ContainerDied","Data":"c7aec798dffb8804c85d54623446e91fe6d90fd6da1144a2b81339cfe719d00b"} Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.247293 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vgkrx" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.247331 4795 scope.go:117] "RemoveContainer" containerID="ef875a96414a8fb97aeae987eb5f1bbc23cf406f49362fc122c4508ac283a1be" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.264846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce748b74-fe9d-4534-8c66-f34c5883c93b" (UID: "ce748b74-fe9d-4534-8c66-f34c5883c93b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.277432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c","Type":"ContainerStarted","Data":"96a2f96d37c159b3a723a2cbc8f82e7cdfd33b58a63548632f400a8c0d819dc5"} Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.280478 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-config" (OuterVolumeSpecName: "config") pod "ce748b74-fe9d-4534-8c66-f34c5883c93b" (UID: "ce748b74-fe9d-4534-8c66-f34c5883c93b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.301180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce748b74-fe9d-4534-8c66-f34c5883c93b" (UID: "ce748b74-fe9d-4534-8c66-f34c5883c93b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.302702 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce748b74-fe9d-4534-8c66-f34c5883c93b" (UID: "ce748b74-fe9d-4534-8c66-f34c5883c93b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.333952 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.334169 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.334186 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.334195 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce748b74-fe9d-4534-8c66-f34c5883c93b-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.673245 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vgkrx"] Dec 05 08:43:53 crc kubenswrapper[4795]: I1205 08:43:53.707319 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vgkrx"] Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.300934 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.345888 4795 generic.go:334] "Generic (PLEG): container finished" podID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerID="b06ef4451be2a4a4ab957d3d2824334f89c2f97e2f81be14b44a905a65540d3b" exitCode=0 Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.345970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" event={"ID":"83eff3d1-a4b3-4d67-9131-57df81514ccb","Type":"ContainerDied","Data":"b06ef4451be2a4a4ab957d3d2824334f89c2f97e2f81be14b44a905a65540d3b"} Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.397201 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-649bd54f-sw6jd"] Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.522899 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5567c5f465-m6ld7"] Dec 05 08:43:54 crc kubenswrapper[4795]: E1205 08:43:54.523546 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce748b74-fe9d-4534-8c66-f34c5883c93b" containerName="init" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.523566 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce748b74-fe9d-4534-8c66-f34c5883c93b" containerName="init" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.523906 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce748b74-fe9d-4534-8c66-f34c5883c93b" containerName="init" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.525252 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.565604 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5567c5f465-m6ld7"] Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.593289 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-scripts\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.593364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44qb\" (UniqueName: \"kubernetes.io/projected/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-kube-api-access-l44qb\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.593401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-config-data\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.593422 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-logs\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.593480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-horizon-secret-key\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.604365 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.701981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l44qb\" (UniqueName: \"kubernetes.io/projected/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-kube-api-access-l44qb\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.702488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-config-data\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.702523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-logs\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.702601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-horizon-secret-key\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.702717 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-scripts\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.703733 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-scripts\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.705056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-config-data\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.705310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-logs\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.717462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-horizon-secret-key\") pod \"horizon-5567c5f465-m6ld7\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.750342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44qb\" (UniqueName: \"kubernetes.io/projected/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-kube-api-access-l44qb\") pod \"horizon-5567c5f465-m6ld7\" 
(UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.825976 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce748b74-fe9d-4534-8c66-f34c5883c93b" path="/var/lib/kubelet/pods/ce748b74-fe9d-4534-8c66-f34c5883c93b/volumes" Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.826748 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:43:54 crc kubenswrapper[4795]: I1205 08:43:54.967669 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:43:55 crc kubenswrapper[4795]: I1205 08:43:55.531631 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6daad2c-c908-49b6-833b-9f762954ac4c","Type":"ContainerStarted","Data":"27687f242406633b549c2946da86296cb1c41edb3375ff0fe1407a70b911e05f"} Dec 05 08:43:55 crc kubenswrapper[4795]: I1205 08:43:55.535445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" event={"ID":"83eff3d1-a4b3-4d67-9131-57df81514ccb","Type":"ContainerStarted","Data":"ef7cc32e1710040e07241a8eae09249e3ff7a8b77c91d42ca6b651731e32f350"} Dec 05 08:43:55 crc kubenswrapper[4795]: I1205 08:43:55.536808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:43:55 crc kubenswrapper[4795]: I1205 08:43:55.557213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c","Type":"ContainerStarted","Data":"a691754537bdc81855a8028b7ad8b943c33dbd92273b25bd8306e3d1f6f24ff1"} Dec 05 08:43:55 crc kubenswrapper[4795]: I1205 08:43:55.591719 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" 
podStartSLOduration=7.59169895 podStartE2EDuration="7.59169895s" podCreationTimestamp="2025-12-05 08:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:55.568295353 +0000 UTC m=+1187.140899112" watchObservedRunningTime="2025-12-05 08:43:55.59169895 +0000 UTC m=+1187.164302689" Dec 05 08:43:55 crc kubenswrapper[4795]: I1205 08:43:55.641365 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5567c5f465-m6ld7"] Dec 05 08:43:56 crc kubenswrapper[4795]: I1205 08:43:56.658969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5567c5f465-m6ld7" event={"ID":"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4","Type":"ContainerStarted","Data":"5d5607515e2b4b1c04915bd7861edd9c1c8f8069c5c6aa63dcaa14123331ab36"} Dec 05 08:43:57 crc kubenswrapper[4795]: I1205 08:43:57.693157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c","Type":"ContainerStarted","Data":"55818e2c87c4b1bdd6a6989b37063240c3f025bd8d7cf9d1b33ee9990cac7960"} Dec 05 08:43:57 crc kubenswrapper[4795]: I1205 08:43:57.693377 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerName="glance-log" containerID="cri-o://a691754537bdc81855a8028b7ad8b943c33dbd92273b25bd8306e3d1f6f24ff1" gracePeriod=30 Dec 05 08:43:57 crc kubenswrapper[4795]: I1205 08:43:57.693693 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerName="glance-httpd" containerID="cri-o://55818e2c87c4b1bdd6a6989b37063240c3f025bd8d7cf9d1b33ee9990cac7960" gracePeriod=30 Dec 05 08:43:57 crc kubenswrapper[4795]: I1205 08:43:57.699448 4795 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerName="glance-log" containerID="cri-o://27687f242406633b549c2946da86296cb1c41edb3375ff0fe1407a70b911e05f" gracePeriod=30 Dec 05 08:43:57 crc kubenswrapper[4795]: I1205 08:43:57.699557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6daad2c-c908-49b6-833b-9f762954ac4c","Type":"ContainerStarted","Data":"05d430e8bac2b5a0296645df39509589b453d237c5eb17617dd222904217d027"} Dec 05 08:43:57 crc kubenswrapper[4795]: I1205 08:43:57.713680 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerName="glance-httpd" containerID="cri-o://05d430e8bac2b5a0296645df39509589b453d237c5eb17617dd222904217d027" gracePeriod=30 Dec 05 08:43:57 crc kubenswrapper[4795]: I1205 08:43:57.768207 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.768188442 podStartE2EDuration="8.768188442s" podCreationTimestamp="2025-12-05 08:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:57.766894088 +0000 UTC m=+1189.339497817" watchObservedRunningTime="2025-12-05 08:43:57.768188442 +0000 UTC m=+1189.340792181" Dec 05 08:43:57 crc kubenswrapper[4795]: I1205 08:43:57.833669 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.825584863 podStartE2EDuration="9.825584863s" podCreationTimestamp="2025-12-05 08:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:43:57.814214558 +0000 UTC m=+1189.386818297" 
watchObservedRunningTime="2025-12-05 08:43:57.825584863 +0000 UTC m=+1189.398188602" Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.748866 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerID="55818e2c87c4b1bdd6a6989b37063240c3f025bd8d7cf9d1b33ee9990cac7960" exitCode=143 Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.749328 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerID="a691754537bdc81855a8028b7ad8b943c33dbd92273b25bd8306e3d1f6f24ff1" exitCode=143 Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.755998 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerID="05d430e8bac2b5a0296645df39509589b453d237c5eb17617dd222904217d027" exitCode=143 Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.756045 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerID="27687f242406633b549c2946da86296cb1c41edb3375ff0fe1407a70b911e05f" exitCode=143 Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.769789 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c","Type":"ContainerDied","Data":"55818e2c87c4b1bdd6a6989b37063240c3f025bd8d7cf9d1b33ee9990cac7960"} Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.769866 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c","Type":"ContainerDied","Data":"a691754537bdc81855a8028b7ad8b943c33dbd92273b25bd8306e3d1f6f24ff1"} Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.769881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e6daad2c-c908-49b6-833b-9f762954ac4c","Type":"ContainerDied","Data":"05d430e8bac2b5a0296645df39509589b453d237c5eb17617dd222904217d027"} Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.769897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6daad2c-c908-49b6-833b-9f762954ac4c","Type":"ContainerDied","Data":"27687f242406633b549c2946da86296cb1c41edb3375ff0fe1407a70b911e05f"} Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.913999 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.980146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-httpd-run\") pod \"e6daad2c-c908-49b6-833b-9f762954ac4c\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.980254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-config-data\") pod \"e6daad2c-c908-49b6-833b-9f762954ac4c\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.980309 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-scripts\") pod \"e6daad2c-c908-49b6-833b-9f762954ac4c\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.980330 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-public-tls-certs\") pod \"e6daad2c-c908-49b6-833b-9f762954ac4c\" (UID: 
\"e6daad2c-c908-49b6-833b-9f762954ac4c\") " Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.980360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-logs\") pod \"e6daad2c-c908-49b6-833b-9f762954ac4c\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.980438 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-combined-ca-bundle\") pod \"e6daad2c-c908-49b6-833b-9f762954ac4c\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.980495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wlw6\" (UniqueName: \"kubernetes.io/projected/e6daad2c-c908-49b6-833b-9f762954ac4c-kube-api-access-9wlw6\") pod \"e6daad2c-c908-49b6-833b-9f762954ac4c\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.980581 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e6daad2c-c908-49b6-833b-9f762954ac4c\" (UID: \"e6daad2c-c908-49b6-833b-9f762954ac4c\") " Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.993317 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e6daad2c-c908-49b6-833b-9f762954ac4c" (UID: "e6daad2c-c908-49b6-833b-9f762954ac4c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:43:58 crc kubenswrapper[4795]: I1205 08:43:58.993858 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:58.997070 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-logs" (OuterVolumeSpecName: "logs") pod "e6daad2c-c908-49b6-833b-9f762954ac4c" (UID: "e6daad2c-c908-49b6-833b-9f762954ac4c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.004811 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-scripts" (OuterVolumeSpecName: "scripts") pod "e6daad2c-c908-49b6-833b-9f762954ac4c" (UID: "e6daad2c-c908-49b6-833b-9f762954ac4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.014579 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6daad2c-c908-49b6-833b-9f762954ac4c-kube-api-access-9wlw6" (OuterVolumeSpecName: "kube-api-access-9wlw6") pod "e6daad2c-c908-49b6-833b-9f762954ac4c" (UID: "e6daad2c-c908-49b6-833b-9f762954ac4c"). InnerVolumeSpecName "kube-api-access-9wlw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.023556 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e6daad2c-c908-49b6-833b-9f762954ac4c" (UID: "e6daad2c-c908-49b6-833b-9f762954ac4c"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.102279 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wlw6\" (UniqueName: \"kubernetes.io/projected/e6daad2c-c908-49b6-833b-9f762954ac4c-kube-api-access-9wlw6\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.102507 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.102541 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.102551 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6daad2c-c908-49b6-833b-9f762954ac4c-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.141720 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6daad2c-c908-49b6-833b-9f762954ac4c" (UID: "e6daad2c-c908-49b6-833b-9f762954ac4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.168741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-config-data" (OuterVolumeSpecName: "config-data") pod "e6daad2c-c908-49b6-833b-9f762954ac4c" (UID: "e6daad2c-c908-49b6-833b-9f762954ac4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.169314 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.204164 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.204206 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.204219 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.225058 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e6daad2c-c908-49b6-833b-9f762954ac4c" (UID: "e6daad2c-c908-49b6-833b-9f762954ac4c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.309160 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6daad2c-c908-49b6-833b-9f762954ac4c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.431661 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.517495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-scripts\") pod \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.517606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-internal-tls-certs\") pod \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.517688 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-config-data\") pod \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.517717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-httpd-run\") pod \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.517769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-logs\") pod \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.517899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-combined-ca-bundle\") pod \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.517944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvfvw\" (UniqueName: \"kubernetes.io/projected/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-kube-api-access-tvfvw\") pod \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.518010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\" (UID: \"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c\") " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.519015 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" (UID: "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.531042 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-logs" (OuterVolumeSpecName: "logs") pod "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" (UID: "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.536789 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-665c6496b5-rcd65"] Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.538819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" (UID: "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.562826 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-scripts" (OuterVolumeSpecName: "scripts") pod "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" (UID: "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.564096 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-kube-api-access-tvfvw" (OuterVolumeSpecName: "kube-api-access-tvfvw") pod "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" (UID: "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c"). InnerVolumeSpecName "kube-api-access-tvfvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.620890 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.620934 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.620943 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.620954 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.620965 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvfvw\" (UniqueName: \"kubernetes.io/projected/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-kube-api-access-tvfvw\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.621174 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-797f5f5996-7wlp4"] Dec 05 08:43:59 crc kubenswrapper[4795]: E1205 08:43:59.621793 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerName="glance-httpd" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.621825 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerName="glance-httpd" Dec 05 08:43:59 crc kubenswrapper[4795]: E1205 08:43:59.621894 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerName="glance-httpd" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.621908 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerName="glance-httpd" Dec 05 08:43:59 crc kubenswrapper[4795]: E1205 08:43:59.621929 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerName="glance-log" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.621995 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerName="glance-log" Dec 05 08:43:59 crc kubenswrapper[4795]: E1205 08:43:59.622053 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerName="glance-log" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.622064 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerName="glance-log" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.622454 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerName="glance-log" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.622779 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerName="glance-httpd" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.622797 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" containerName="glance-httpd" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.622832 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" containerName="glance-log" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.624236 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.669454 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.708923 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" (UID: "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.725861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-scripts\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.725926 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-combined-ca-bundle\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.725956 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-tls-certs\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.726025 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-logs\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.726064 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-secret-key\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.726252 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-config-data\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.726347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfp5j\" (UniqueName: \"kubernetes.io/projected/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-kube-api-access-lfp5j\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.726494 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.740444 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node 
"crc" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.778652 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" (UID: "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.779149 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-config-data" (OuterVolumeSpecName: "config-data") pod "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" (UID: "8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.819120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-797f5f5996-7wlp4"] Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.834589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-config-data\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.834675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfp5j\" (UniqueName: \"kubernetes.io/projected/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-kube-api-access-lfp5j\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.834770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-scripts\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.834803 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-combined-ca-bundle\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.834822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-tls-certs\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.834876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-logs\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.834925 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-secret-key\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.835015 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 
08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.835030 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.835042 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.855646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-config-data\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.858243 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-scripts\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.857211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-logs\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.899926 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.900148 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c","Type":"ContainerDied","Data":"96a2f96d37c159b3a723a2cbc8f82e7cdfd33b58a63548632f400a8c0d819dc5"} Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.900199 4795 scope.go:117] "RemoveContainer" containerID="55818e2c87c4b1bdd6a6989b37063240c3f025bd8d7cf9d1b33ee9990cac7960" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.918131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5567c5f465-m6ld7"] Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.921458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-secret-key\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.924178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-combined-ca-bundle\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.930327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfp5j\" (UniqueName: \"kubernetes.io/projected/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-kube-api-access-lfp5j\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.939531 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-tls-certs\") pod \"horizon-797f5f5996-7wlp4\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.939988 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6daad2c-c908-49b6-833b-9f762954ac4c","Type":"ContainerDied","Data":"0b4a83070fd7a5be185b4760695daa06b2802d3f4522e49c8a03e14486d1f31b"} Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.941328 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57b485fdb4-h9cjs"] Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.946764 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.948575 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:43:59 crc kubenswrapper[4795]: I1205 08:43:59.957109 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57b485fdb4-h9cjs"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.031935 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.040074 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w92cn\" (UniqueName: \"kubernetes.io/projected/f89d9173-0065-4beb-a1b6-ba7be5094a58-kube-api-access-w92cn\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.040142 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f89d9173-0065-4beb-a1b6-ba7be5094a58-scripts\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.040173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f89d9173-0065-4beb-a1b6-ba7be5094a58-logs\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.040192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-horizon-tls-certs\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.040219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f89d9173-0065-4beb-a1b6-ba7be5094a58-config-data\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") 
" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.040310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-combined-ca-bundle\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.040363 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-horizon-secret-key\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.068560 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.090900 4795 scope.go:117] "RemoveContainer" containerID="a691754537bdc81855a8028b7ad8b943c33dbd92273b25bd8306e3d1f6f24ff1" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.103581 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.123136 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.142805 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w92cn\" (UniqueName: \"kubernetes.io/projected/f89d9173-0065-4beb-a1b6-ba7be5094a58-kube-api-access-w92cn\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.142889 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f89d9173-0065-4beb-a1b6-ba7be5094a58-scripts\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.142924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f89d9173-0065-4beb-a1b6-ba7be5094a58-logs\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.142952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-horizon-tls-certs\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.142990 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f89d9173-0065-4beb-a1b6-ba7be5094a58-config-data\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.143126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-combined-ca-bundle\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.143189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-horizon-secret-key\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.143964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f89d9173-0065-4beb-a1b6-ba7be5094a58-logs\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.144826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f89d9173-0065-4beb-a1b6-ba7be5094a58-scripts\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.145411 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.152190 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f89d9173-0065-4beb-a1b6-ba7be5094a58-config-data\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.159718 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-horizon-secret-key\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.160722 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-combined-ca-bundle\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.161494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f89d9173-0065-4beb-a1b6-ba7be5094a58-horizon-tls-certs\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.175186 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w92cn\" (UniqueName: \"kubernetes.io/projected/f89d9173-0065-4beb-a1b6-ba7be5094a58-kube-api-access-w92cn\") pod \"horizon-57b485fdb4-h9cjs\" (UID: \"f89d9173-0065-4beb-a1b6-ba7be5094a58\") " pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.213888 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.215737 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.220517 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.221163 4795 scope.go:117] "RemoveContainer" containerID="05d430e8bac2b5a0296645df39509589b453d237c5eb17617dd222904217d027" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.222272 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.222484 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.222719 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wbtsl" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.235538 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.286316 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.298369 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.310418 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.311392 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.317196 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.351860 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.351940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.351971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.351990 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-config-data\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352076 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352095 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-logs\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tfc4\" (UniqueName: \"kubernetes.io/projected/e94043a9-32bf-4842-ad1f-583d7bb8b933-kube-api-access-6tfc4\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-scripts\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352175 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-config-data\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352205 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-scripts\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352243 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-logs\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352281 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2v4ng\" (UniqueName: \"kubernetes.io/projected/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-kube-api-access-2v4ng\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.352376 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.357522 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.434057 4795 scope.go:117] "RemoveContainer" containerID="27687f242406633b549c2946da86296cb1c41edb3375ff0fe1407a70b911e05f" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.461909 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.461971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462007 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462034 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-config-data\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462142 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-logs\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462165 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tfc4\" (UniqueName: \"kubernetes.io/projected/e94043a9-32bf-4842-ad1f-583d7bb8b933-kube-api-access-6tfc4\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462207 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-scripts\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-config-data\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-scripts\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-logs\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462406 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v4ng\" (UniqueName: \"kubernetes.io/projected/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-kube-api-access-2v4ng\") pod 
\"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.462869 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.463748 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.470914 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.471225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.471575 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.477786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-scripts\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.478809 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-logs\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.479850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.484279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-logs\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc 
kubenswrapper[4795]: I1205 08:44:00.485997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-config-data\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.503703 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-scripts\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.552718 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tfc4\" (UniqueName: \"kubernetes.io/projected/e94043a9-32bf-4842-ad1f-583d7bb8b933-kube-api-access-6tfc4\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.554235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-config-data\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.565447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.591592 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.592965 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v4ng\" (UniqueName: \"kubernetes.io/projected/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-kube-api-access-2v4ng\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.593066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") " pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.642881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.654854 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.762255 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c" path="/var/lib/kubelet/pods/8c20ee93-67b8-4b21-8fa7-7b2dd7d55d3c/volumes" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.763347 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6daad2c-c908-49b6-833b-9f762954ac4c" path="/var/lib/kubelet/pods/e6daad2c-c908-49b6-833b-9f762954ac4c/volumes" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.883686 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:00 crc kubenswrapper[4795]: I1205 08:44:00.976825 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-797f5f5996-7wlp4"] Dec 05 08:44:02 crc kubenswrapper[4795]: I1205 08:44:02.014402 4795 generic.go:334] "Generic (PLEG): container finished" podID="04bd2c9e-9b28-45df-bbb7-4da844ddec3d" containerID="a19b2ed86ada4f3fd6d8dc18a552df4ab151f3651f6e0c08cf9d9885ead1f563" exitCode=0 Dec 05 08:44:02 crc kubenswrapper[4795]: I1205 08:44:02.014538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsp78" event={"ID":"04bd2c9e-9b28-45df-bbb7-4da844ddec3d","Type":"ContainerDied","Data":"a19b2ed86ada4f3fd6d8dc18a552df4ab151f3651f6e0c08cf9d9885ead1f563"} Dec 05 08:44:04 crc kubenswrapper[4795]: W1205 08:44:04.008732 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod821b3890_4d8d_4ce0_b3b2_55793a9c98cd.slice/crio-88f716fc372ac5bdaf05e54717c04f9411af1fe14f02f0e5a9c4d7a63456e75b WatchSource:0}: Error finding container 88f716fc372ac5bdaf05e54717c04f9411af1fe14f02f0e5a9c4d7a63456e75b: Status 404 returned error can't find the container with id 
88f716fc372ac5bdaf05e54717c04f9411af1fe14f02f0e5a9c4d7a63456e75b Dec 05 08:44:04 crc kubenswrapper[4795]: I1205 08:44:04.056509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerStarted","Data":"88f716fc372ac5bdaf05e54717c04f9411af1fe14f02f0e5a9c4d7a63456e75b"} Dec 05 08:44:04 crc kubenswrapper[4795]: I1205 08:44:04.935026 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:44:05 crc kubenswrapper[4795]: I1205 08:44:05.019410 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztw96"] Dec 05 08:44:05 crc kubenswrapper[4795]: I1205 08:44:05.019836 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="dnsmasq-dns" containerID="cri-o://538a6868476e70c73ffd2e7f712d1babf028ebe8555d570de98a39598dc05838" gracePeriod=10 Dec 05 08:44:06 crc kubenswrapper[4795]: I1205 08:44:06.108425 4795 generic.go:334] "Generic (PLEG): container finished" podID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerID="538a6868476e70c73ffd2e7f712d1babf028ebe8555d570de98a39598dc05838" exitCode=0 Dec 05 08:44:06 crc kubenswrapper[4795]: I1205 08:44:06.108815 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" event={"ID":"b6edcee3-faa8-4de5-8a83-d0dd6803844a","Type":"ContainerDied","Data":"538a6868476e70c73ffd2e7f712d1babf028ebe8555d570de98a39598dc05838"} Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.015794 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.121317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rsp78" event={"ID":"04bd2c9e-9b28-45df-bbb7-4da844ddec3d","Type":"ContainerDied","Data":"0712dac66cbe9c5b9f775c63d8428d48b20258e44ee65ef7207cbedb28335518"} Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.121375 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0712dac66cbe9c5b9f775c63d8428d48b20258e44ee65ef7207cbedb28335518" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.121419 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rsp78" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.195073 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-combined-ca-bundle\") pod \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.195184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcvzt\" (UniqueName: \"kubernetes.io/projected/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-kube-api-access-bcvzt\") pod \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.195381 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-config-data\") pod \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.195478 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-credential-keys\") pod \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.195592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-fernet-keys\") pod \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.195830 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-scripts\") pod \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\" (UID: \"04bd2c9e-9b28-45df-bbb7-4da844ddec3d\") " Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.203881 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "04bd2c9e-9b28-45df-bbb7-4da844ddec3d" (UID: "04bd2c9e-9b28-45df-bbb7-4da844ddec3d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.205592 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "04bd2c9e-9b28-45df-bbb7-4da844ddec3d" (UID: "04bd2c9e-9b28-45df-bbb7-4da844ddec3d"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.206634 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-scripts" (OuterVolumeSpecName: "scripts") pod "04bd2c9e-9b28-45df-bbb7-4da844ddec3d" (UID: "04bd2c9e-9b28-45df-bbb7-4da844ddec3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.218647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-kube-api-access-bcvzt" (OuterVolumeSpecName: "kube-api-access-bcvzt") pod "04bd2c9e-9b28-45df-bbb7-4da844ddec3d" (UID: "04bd2c9e-9b28-45df-bbb7-4da844ddec3d"). InnerVolumeSpecName "kube-api-access-bcvzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.234383 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-config-data" (OuterVolumeSpecName: "config-data") pod "04bd2c9e-9b28-45df-bbb7-4da844ddec3d" (UID: "04bd2c9e-9b28-45df-bbb7-4da844ddec3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.235385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04bd2c9e-9b28-45df-bbb7-4da844ddec3d" (UID: "04bd2c9e-9b28-45df-bbb7-4da844ddec3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.297432 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.297535 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.297551 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.297566 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcvzt\" (UniqueName: \"kubernetes.io/projected/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-kube-api-access-bcvzt\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.297580 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:07 crc kubenswrapper[4795]: I1205 08:44:07.297592 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04bd2c9e-9b28-45df-bbb7-4da844ddec3d-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.137150 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rsp78"] Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.144672 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rsp78"] Dec 05 08:44:08 crc 
kubenswrapper[4795]: I1205 08:44:08.231355 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cg5dx"] Dec 05 08:44:08 crc kubenswrapper[4795]: E1205 08:44:08.231879 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04bd2c9e-9b28-45df-bbb7-4da844ddec3d" containerName="keystone-bootstrap" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.231896 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04bd2c9e-9b28-45df-bbb7-4da844ddec3d" containerName="keystone-bootstrap" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.232120 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="04bd2c9e-9b28-45df-bbb7-4da844ddec3d" containerName="keystone-bootstrap" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.232873 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.237008 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.237007 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.237695 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.242042 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cg5dx"] Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.242676 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.242718 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c75xn" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.422851 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddb28\" (UniqueName: \"kubernetes.io/projected/b05e0563-19d6-439b-b9d2-d241537794c4-kube-api-access-ddb28\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.422999 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-fernet-keys\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.423109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-scripts\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.423204 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-config-data\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.423226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-credential-keys\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.423417 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-combined-ca-bundle\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.524959 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-combined-ca-bundle\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.525047 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddb28\" (UniqueName: \"kubernetes.io/projected/b05e0563-19d6-439b-b9d2-d241537794c4-kube-api-access-ddb28\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.525106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-fernet-keys\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.525132 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-scripts\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.525173 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-config-data\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.525191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-credential-keys\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.534370 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-fernet-keys\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.534724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-credential-keys\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.534808 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-combined-ca-bundle\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.537879 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-config-data\") pod \"keystone-bootstrap-cg5dx\" (UID: 
\"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.545949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-scripts\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.555216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddb28\" (UniqueName: \"kubernetes.io/projected/b05e0563-19d6-439b-b9d2-d241537794c4-kube-api-access-ddb28\") pod \"keystone-bootstrap-cg5dx\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.790188 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04bd2c9e-9b28-45df-bbb7-4da844ddec3d" path="/var/lib/kubelet/pods/04bd2c9e-9b28-45df-bbb7-4da844ddec3d/volumes" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.854382 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c75xn" Dec 05 08:44:08 crc kubenswrapper[4795]: I1205 08:44:08.862198 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:09 crc kubenswrapper[4795]: I1205 08:44:09.674668 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Dec 05 08:44:10 crc kubenswrapper[4795]: I1205 08:44:10.828051 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:44:10 crc kubenswrapper[4795]: I1205 08:44:10.828146 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:44:13 crc kubenswrapper[4795]: E1205 08:44:13.971295 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 08:44:13 crc kubenswrapper[4795]: E1205 08:44:13.972077 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bch574h5hfch55h54h5f9h648h58bh65dh659h5bch58ch74h5b4h78h8fhb7h695h85h587h598h65h54ch587h5cdh64dhb6h58bhffh8bh69q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l44qb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5567c5f465-m6ld7_openstack(7558ab62-bbe2-431f-a0e7-c8fc78a49fd4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:44:13 crc kubenswrapper[4795]: E1205 
08:44:13.975760 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5567c5f465-m6ld7" podUID="7558ab62-bbe2-431f-a0e7-c8fc78a49fd4" Dec 05 08:44:13 crc kubenswrapper[4795]: E1205 08:44:13.989619 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 08:44:13 crc kubenswrapper[4795]: E1205 08:44:13.989870 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68h5dh549hb5h68dh96h67h595h5dfh547h5d7h586hch555h578h94h5bfh5cbh64fh567h567h96h57fh65fh5d9hc4h669h5d8h587h5bdh566h5d5q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5v452,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-649bd54f-sw6jd_openstack(ffa70a61-cdc5-40fa-a9bd-338a244659c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:44:13 crc kubenswrapper[4795]: E1205 
08:44:13.992473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-649bd54f-sw6jd" podUID="ffa70a61-cdc5-40fa-a9bd-338a244659c4" Dec 05 08:44:14 crc kubenswrapper[4795]: E1205 08:44:14.003900 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 05 08:44:14 crc kubenswrapper[4795]: E1205 08:44:14.004166 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h64ch645h58h5bch5d7h575hch555h655h56bh64fhb4h697h67dh68hc9h5c5hb4h8dh694h678h5c8h699h87h556h557h5ffh5f7h684h79h689q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7h8zh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-665c6496b5-rcd65_openstack(7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:44:14 crc kubenswrapper[4795]: E1205 
08:44:14.006799 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-665c6496b5-rcd65" podUID="7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff" Dec 05 08:44:14 crc kubenswrapper[4795]: I1205 08:44:14.674683 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Dec 05 08:44:16 crc kubenswrapper[4795]: E1205 08:44:16.563256 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 05 08:44:16 crc kubenswrapper[4795]: E1205 08:44:16.564371 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8fh5fbh77h575h5f8h549h5ddh694h55h588h5bdh5h554h689h585h696h677hc9hd6h75hdfhc5h558hf6h64dhd4h65h68bh98h65ch5d5h647q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wcj8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b1f0a01c-4d6c-4534-950a-699df43b935a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:44:20 crc kubenswrapper[4795]: I1205 08:44:20.283490 4795 generic.go:334] "Generic (PLEG): container finished" podID="7de95e6b-d594-4ed4-8b8d-041346856347" containerID="21ba3b63122aef1bb5134dde6523b67bf9263ad14895e2960724b5a4d48c62a2" exitCode=0 Dec 05 08:44:20 crc kubenswrapper[4795]: I1205 08:44:20.283596 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bxt72" event={"ID":"7de95e6b-d594-4ed4-8b8d-041346856347","Type":"ContainerDied","Data":"21ba3b63122aef1bb5134dde6523b67bf9263ad14895e2960724b5a4d48c62a2"} Dec 05 08:44:24 crc kubenswrapper[4795]: I1205 08:44:24.673504 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Dec 05 08:44:24 crc kubenswrapper[4795]: I1205 08:44:24.675413 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.234184 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.268959 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.282222 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.296927 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-config-data\") pod \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.296998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-config-data\") pod \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.297103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-scripts\") pod \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.297137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-horizon-secret-key\") pod \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " Dec 
05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.297241 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffa70a61-cdc5-40fa-a9bd-338a244659c4-logs\") pod \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.297959 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffa70a61-cdc5-40fa-a9bd-338a244659c4-logs" (OuterVolumeSpecName: "logs") pod "ffa70a61-cdc5-40fa-a9bd-338a244659c4" (UID: "ffa70a61-cdc5-40fa-a9bd-338a244659c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298170 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-config-data" (OuterVolumeSpecName: "config-data") pod "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4" (UID: "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298176 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-config-data" (OuterVolumeSpecName: "config-data") pod "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff" (UID: "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-scripts" (OuterVolumeSpecName: "scripts") pod "ffa70a61-cdc5-40fa-a9bd-338a244659c4" (UID: "ffa70a61-cdc5-40fa-a9bd-338a244659c4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h8zh\" (UniqueName: \"kubernetes.io/projected/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-kube-api-access-7h8zh\") pod \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-horizon-secret-key\") pod \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l44qb\" (UniqueName: \"kubernetes.io/projected/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-kube-api-access-l44qb\") pod \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298526 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-logs\") pod \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298640 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-config-data\") pod \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298698 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ffa70a61-cdc5-40fa-a9bd-338a244659c4-horizon-secret-key\") pod \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v452\" (UniqueName: \"kubernetes.io/projected/ffa70a61-cdc5-40fa-a9bd-338a244659c4-kube-api-access-5v452\") pod \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\" (UID: \"ffa70a61-cdc5-40fa-a9bd-338a244659c4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-scripts\") pod \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\" (UID: \"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-scripts\") pod \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.298916 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-logs\") pod \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\" (UID: \"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff\") " Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.299796 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.299827 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.299842 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.299856 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ffa70a61-cdc5-40fa-a9bd-338a244659c4-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.301176 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-scripts" (OuterVolumeSpecName: "scripts") pod "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4" (UID: "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.306061 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-logs" (OuterVolumeSpecName: "logs") pod "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff" (UID: "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.306747 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-scripts" (OuterVolumeSpecName: "scripts") pod "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff" (UID: "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.308438 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa70a61-cdc5-40fa-a9bd-338a244659c4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ffa70a61-cdc5-40fa-a9bd-338a244659c4" (UID: "ffa70a61-cdc5-40fa-a9bd-338a244659c4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.309762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-logs" (OuterVolumeSpecName: "logs") pod "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4" (UID: "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.312109 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa70a61-cdc5-40fa-a9bd-338a244659c4-kube-api-access-5v452" (OuterVolumeSpecName: "kube-api-access-5v452") pod "ffa70a61-cdc5-40fa-a9bd-338a244659c4" (UID: "ffa70a61-cdc5-40fa-a9bd-338a244659c4"). InnerVolumeSpecName "kube-api-access-5v452". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.320405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-config-data" (OuterVolumeSpecName: "config-data") pod "ffa70a61-cdc5-40fa-a9bd-338a244659c4" (UID: "ffa70a61-cdc5-40fa-a9bd-338a244659c4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.334047 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4" (UID: "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.336200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-kube-api-access-7h8zh" (OuterVolumeSpecName: "kube-api-access-7h8zh") pod "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff" (UID: "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff"). InnerVolumeSpecName "kube-api-access-7h8zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.336874 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff" (UID: "7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.343666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-kube-api-access-l44qb" (OuterVolumeSpecName: "kube-api-access-l44qb") pod "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4" (UID: "7558ab62-bbe2-431f-a0e7-c8fc78a49fd4"). InnerVolumeSpecName "kube-api-access-l44qb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.353695 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-649bd54f-sw6jd" event={"ID":"ffa70a61-cdc5-40fa-a9bd-338a244659c4","Type":"ContainerDied","Data":"731411a3a0c93a7ae6e24f36a11e9ef5c338429e82eb8d6c1722cca2db523713"} Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.353802 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-649bd54f-sw6jd" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.357335 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-665c6496b5-rcd65" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.357397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-665c6496b5-rcd65" event={"ID":"7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff","Type":"ContainerDied","Data":"b60b7281f4294237813b8573c9fdbfca9fb443e275b0d406005dfca4d8609c7b"} Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.361348 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5567c5f465-m6ld7" event={"ID":"7558ab62-bbe2-431f-a0e7-c8fc78a49fd4","Type":"ContainerDied","Data":"5d5607515e2b4b1c04915bd7861edd9c1c8f8069c5c6aa63dcaa14123331ab36"} Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.361643 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5567c5f465-m6ld7" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.403965 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404010 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h8zh\" (UniqueName: \"kubernetes.io/projected/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-kube-api-access-7h8zh\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404023 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404036 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l44qb\" (UniqueName: \"kubernetes.io/projected/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-kube-api-access-l44qb\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404047 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404060 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ffa70a61-cdc5-40fa-a9bd-338a244659c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404069 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ffa70a61-cdc5-40fa-a9bd-338a244659c4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: 
I1205 08:44:26.404079 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v452\" (UniqueName: \"kubernetes.io/projected/ffa70a61-cdc5-40fa-a9bd-338a244659c4-kube-api-access-5v452\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404088 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404103 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.404111 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.460492 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-649bd54f-sw6jd"] Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.494950 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-649bd54f-sw6jd"] Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.546851 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5567c5f465-m6ld7"] Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.546926 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5567c5f465-m6ld7"] Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.566389 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-665c6496b5-rcd65"] Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.574327 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-665c6496b5-rcd65"] Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 
08:44:26.760132 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7558ab62-bbe2-431f-a0e7-c8fc78a49fd4" path="/var/lib/kubelet/pods/7558ab62-bbe2-431f-a0e7-c8fc78a49fd4/volumes" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.760698 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff" path="/var/lib/kubelet/pods/7f3d42fb-b0fc-4388-b7fc-17ca1fb7a4ff/volumes" Dec 05 08:44:26 crc kubenswrapper[4795]: I1205 08:44:26.761142 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa70a61-cdc5-40fa-a9bd-338a244659c4" path="/var/lib/kubelet/pods/ffa70a61-cdc5-40fa-a9bd-338a244659c4/volumes" Dec 05 08:44:26 crc kubenswrapper[4795]: E1205 08:44:26.924711 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 05 08:44:26 crc kubenswrapper[4795]: E1205 08:44:26.924900 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6lbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-k28kg_openstack(44d62cd1-585f-4756-b3f9-6f0714ea3248): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:44:26 crc kubenswrapper[4795]: E1205 08:44:26.928112 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-k28kg" 
podUID="44d62cd1-585f-4756-b3f9-6f0714ea3248" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.000115 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.031088 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-bxt72" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.037441 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-swift-storage-0\") pod \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.037544 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-config\") pod \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.037599 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-nb\") pod \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.037765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-sb\") pod \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.037797 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-grg6r\" (UniqueName: \"kubernetes.io/projected/b6edcee3-faa8-4de5-8a83-d0dd6803844a-kube-api-access-grg6r\") pod \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.037824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-svc\") pod \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\" (UID: \"b6edcee3-faa8-4de5-8a83-d0dd6803844a\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.043063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6edcee3-faa8-4de5-8a83-d0dd6803844a-kube-api-access-grg6r" (OuterVolumeSpecName: "kube-api-access-grg6r") pod "b6edcee3-faa8-4de5-8a83-d0dd6803844a" (UID: "b6edcee3-faa8-4de5-8a83-d0dd6803844a"). InnerVolumeSpecName "kube-api-access-grg6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.121641 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6edcee3-faa8-4de5-8a83-d0dd6803844a" (UID: "b6edcee3-faa8-4de5-8a83-d0dd6803844a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.132716 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-config" (OuterVolumeSpecName: "config") pod "b6edcee3-faa8-4de5-8a83-d0dd6803844a" (UID: "b6edcee3-faa8-4de5-8a83-d0dd6803844a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.137632 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6edcee3-faa8-4de5-8a83-d0dd6803844a" (UID: "b6edcee3-faa8-4de5-8a83-d0dd6803844a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.139519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-config\") pod \"7de95e6b-d594-4ed4-8b8d-041346856347\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.139856 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-combined-ca-bundle\") pod \"7de95e6b-d594-4ed4-8b8d-041346856347\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.139945 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p248j\" (UniqueName: \"kubernetes.io/projected/7de95e6b-d594-4ed4-8b8d-041346856347-kube-api-access-p248j\") pod \"7de95e6b-d594-4ed4-8b8d-041346856347\" (UID: \"7de95e6b-d594-4ed4-8b8d-041346856347\") " Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.140958 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.140987 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.141001 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grg6r\" (UniqueName: \"kubernetes.io/projected/b6edcee3-faa8-4de5-8a83-d0dd6803844a-kube-api-access-grg6r\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.141012 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.146060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6edcee3-faa8-4de5-8a83-d0dd6803844a" (UID: "b6edcee3-faa8-4de5-8a83-d0dd6803844a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.152928 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de95e6b-d594-4ed4-8b8d-041346856347-kube-api-access-p248j" (OuterVolumeSpecName: "kube-api-access-p248j") pod "7de95e6b-d594-4ed4-8b8d-041346856347" (UID: "7de95e6b-d594-4ed4-8b8d-041346856347"). InnerVolumeSpecName "kube-api-access-p248j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.160204 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6edcee3-faa8-4de5-8a83-d0dd6803844a" (UID: "b6edcee3-faa8-4de5-8a83-d0dd6803844a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.183852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-config" (OuterVolumeSpecName: "config") pod "7de95e6b-d594-4ed4-8b8d-041346856347" (UID: "7de95e6b-d594-4ed4-8b8d-041346856347"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.184272 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7de95e6b-d594-4ed4-8b8d-041346856347" (UID: "7de95e6b-d594-4ed4-8b8d-041346856347"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.242742 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.242788 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p248j\" (UniqueName: \"kubernetes.io/projected/7de95e6b-d594-4ed4-8b8d-041346856347-kube-api-access-p248j\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.242803 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.242813 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7de95e6b-d594-4ed4-8b8d-041346856347-config\") on node \"crc\" DevicePath \"\"" Dec 05 
08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.242824 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6edcee3-faa8-4de5-8a83-d0dd6803844a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.373957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" event={"ID":"b6edcee3-faa8-4de5-8a83-d0dd6803844a","Type":"ContainerDied","Data":"7ce6d361889f8f87d0548b559345046389ab975b8ff2508e7fd13c663e0beff4"} Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.374004 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.374028 4795 scope.go:117] "RemoveContainer" containerID="538a6868476e70c73ffd2e7f712d1babf028ebe8555d570de98a39598dc05838" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.379677 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-bxt72" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.381009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-bxt72" event={"ID":"7de95e6b-d594-4ed4-8b8d-041346856347","Type":"ContainerDied","Data":"e825e4fbaedb8e968ed22153cc1de548ce30d2362815f0d54881c381a2052a21"} Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.381057 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e825e4fbaedb8e968ed22153cc1de548ce30d2362815f0d54881c381a2052a21" Dec 05 08:44:27 crc kubenswrapper[4795]: E1205 08:44:27.385437 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-k28kg" podUID="44d62cd1-585f-4756-b3f9-6f0714ea3248" Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.431473 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztw96"] Dec 05 08:44:27 crc kubenswrapper[4795]: I1205 08:44:27.439882 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-ztw96"] Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.410104 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-6t2ng"] Dec 05 08:44:28 crc kubenswrapper[4795]: E1205 08:44:28.410574 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="dnsmasq-dns" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.410589 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="dnsmasq-dns" Dec 05 08:44:28 crc kubenswrapper[4795]: E1205 08:44:28.410606 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="init" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.410626 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="init" Dec 05 08:44:28 crc kubenswrapper[4795]: E1205 08:44:28.410659 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de95e6b-d594-4ed4-8b8d-041346856347" containerName="neutron-db-sync" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.410667 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de95e6b-d594-4ed4-8b8d-041346856347" containerName="neutron-db-sync" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.410837 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de95e6b-d594-4ed4-8b8d-041346856347" containerName="neutron-db-sync" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.410856 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="dnsmasq-dns" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.411977 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.476878 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-6t2ng"] Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.495258 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgwnb\" (UniqueName: \"kubernetes.io/projected/19ddeb6e-4006-4919-86f7-e82748bd655f-kube-api-access-rgwnb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.495321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-svc\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.495340 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.495398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-config\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.495430 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.495524 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.588564 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-666967b744-rtdmp"] Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.591700 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.603179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgwnb\" (UniqueName: \"kubernetes.io/projected/19ddeb6e-4006-4919-86f7-e82748bd655f-kube-api-access-rgwnb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.603261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-svc\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.603279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.603329 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-config\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.603362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.603445 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.605023 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rls2n" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.605839 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.606191 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-config\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.606227 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.606293 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.607098 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.614383 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.614802 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-svc\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.618556 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.624152 4795 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-666967b744-rtdmp"] Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.648270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgwnb\" (UniqueName: \"kubernetes.io/projected/19ddeb6e-4006-4919-86f7-e82748bd655f-kube-api-access-rgwnb\") pod \"dnsmasq-dns-6b7b667979-6t2ng\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.706100 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-combined-ca-bundle\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.706169 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-config\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.706199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-httpd-config\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.706221 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-ovndb-tls-certs\") pod \"neutron-666967b744-rtdmp\" (UID: 
\"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.706263 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkr2p\" (UniqueName: \"kubernetes.io/projected/1720beb5-14de-4db2-9581-945e2f781500-kube-api-access-jkr2p\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.769800 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.779898 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" path="/var/lib/kubelet/pods/b6edcee3-faa8-4de5-8a83-d0dd6803844a/volumes" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.834440 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-config\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.834592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-httpd-config\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.834697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-ovndb-tls-certs\") pod \"neutron-666967b744-rtdmp\" (UID: 
\"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.834789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkr2p\" (UniqueName: \"kubernetes.io/projected/1720beb5-14de-4db2-9581-945e2f781500-kube-api-access-jkr2p\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.834958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-combined-ca-bundle\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.846098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-config\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.857982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkr2p\" (UniqueName: \"kubernetes.io/projected/1720beb5-14de-4db2-9581-945e2f781500-kube-api-access-jkr2p\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.858207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-httpd-config\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc 
kubenswrapper[4795]: I1205 08:44:28.869730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-combined-ca-bundle\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.870274 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-ovndb-tls-certs\") pod \"neutron-666967b744-rtdmp\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:28 crc kubenswrapper[4795]: I1205 08:44:28.949637 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:29 crc kubenswrapper[4795]: I1205 08:44:29.674846 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-ztw96" podUID="b6edcee3-faa8-4de5-8a83-d0dd6803844a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.843834 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6766d78d6c-l65vj"] Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.846029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.848097 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.849922 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.872348 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6766d78d6c-l65vj"] Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.994838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-internal-tls-certs\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.994906 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-ovndb-tls-certs\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.995192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftkcp\" (UniqueName: \"kubernetes.io/projected/8ada623e-3e62-480c-a681-19685e13dc82-kube-api-access-ftkcp\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.995355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-combined-ca-bundle\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.995443 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-httpd-config\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.995596 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-config\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:30 crc kubenswrapper[4795]: I1205 08:44:30.995845 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-public-tls-certs\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.097977 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-internal-tls-certs\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.098045 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-ovndb-tls-certs\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.098126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftkcp\" (UniqueName: \"kubernetes.io/projected/8ada623e-3e62-480c-a681-19685e13dc82-kube-api-access-ftkcp\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.098170 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-combined-ca-bundle\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.098203 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-httpd-config\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.098248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-config\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.098270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-public-tls-certs\") pod 
\"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.105789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-config\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.106929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-public-tls-certs\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.108047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-ovndb-tls-certs\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.108763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-combined-ca-bundle\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.112017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-internal-tls-certs\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc 
kubenswrapper[4795]: I1205 08:44:31.115566 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ada623e-3e62-480c-a681-19685e13dc82-httpd-config\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.117487 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftkcp\" (UniqueName: \"kubernetes.io/projected/8ada623e-3e62-480c-a681-19685e13dc82-kube-api-access-ftkcp\") pod \"neutron-6766d78d6c-l65vj\" (UID: \"8ada623e-3e62-480c-a681-19685e13dc82\") " pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:31 crc kubenswrapper[4795]: I1205 08:44:31.172201 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:34 crc kubenswrapper[4795]: E1205 08:44:34.344328 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 05 08:44:34 crc kubenswrapper[4795]: E1205 08:44:34.345289 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lxztf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-cq4gw_openstack(dd6ce9d5-263a-4b05-83e5-c349f0038001): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:44:34 crc kubenswrapper[4795]: E1205 08:44:34.346795 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-cq4gw" podUID="dd6ce9d5-263a-4b05-83e5-c349f0038001" Dec 05 08:44:34 crc kubenswrapper[4795]: E1205 08:44:34.463480 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-cq4gw" podUID="dd6ce9d5-263a-4b05-83e5-c349f0038001" Dec 05 08:44:34 crc kubenswrapper[4795]: I1205 08:44:34.836113 4795 scope.go:117] "RemoveContainer" containerID="2bacd66b57a0eb329290b8d086ff0982116af038de1537aad1db866f36a93dc1" Dec 05 08:44:34 crc kubenswrapper[4795]: I1205 08:44:34.846084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57b485fdb4-h9cjs"] Dec 05 08:44:34 crc kubenswrapper[4795]: W1205 08:44:34.894216 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf89d9173_0065_4beb_a1b6_ba7be5094a58.slice/crio-c8157297444c3431d6584bc816c81caf9375d59f1dae5eefce777fd1401f0ebb WatchSource:0}: Error finding container c8157297444c3431d6584bc816c81caf9375d59f1dae5eefce777fd1401f0ebb: Status 404 returned error can't find the container with id 
c8157297444c3431d6584bc816c81caf9375d59f1dae5eefce777fd1401f0ebb Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.356710 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.492935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010","Type":"ContainerStarted","Data":"bfeb06ca5f52c1d93ed4a19e827c425fd32bb5657d8469e4f506835009eb5f0b"} Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.511797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nxkbh" event={"ID":"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87","Type":"ContainerStarted","Data":"0da4fdb12c2d95e3f2dde1f024cce45891fc2bf5f879e6a7cc473244c713f96d"} Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.547196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerStarted","Data":"938a4b107fced14e75138c8bdcef48677332ceeefa2f85fe3827d4dac8041a84"} Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.547744 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerStarted","Data":"c8157297444c3431d6584bc816c81caf9375d59f1dae5eefce777fd1401f0ebb"} Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.569328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1f0a01c-4d6c-4534-950a-699df43b935a","Type":"ContainerStarted","Data":"a15485c5ca1c12de33abea552ef3854d359f6cf021bfbc8c4f438f477a74f382"} Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.591195 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" 
event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerStarted","Data":"f314e1ee64c9bb2c50d114e316eaa8f4a1d8a694b7ce30ec1791781a0e7d7a7f"} Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.621183 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.625963 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-nxkbh" podStartSLOduration=12.727898836 podStartE2EDuration="47.625937325s" podCreationTimestamp="2025-12-05 08:43:48 +0000 UTC" firstStartedPulling="2025-12-05 08:43:51.174273038 +0000 UTC m=+1182.746876777" lastFinishedPulling="2025-12-05 08:44:26.072311527 +0000 UTC m=+1217.644915266" observedRunningTime="2025-12-05 08:44:35.543156805 +0000 UTC m=+1227.115760544" watchObservedRunningTime="2025-12-05 08:44:35.625937325 +0000 UTC m=+1227.198541064" Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.689035 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cg5dx"] Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.718830 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.801735 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6766d78d6c-l65vj"] Dec 05 08:44:35 crc kubenswrapper[4795]: I1205 08:44:35.900839 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-6t2ng"] Dec 05 08:44:35 crc kubenswrapper[4795]: W1205 08:44:35.913450 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19ddeb6e_4006_4919_86f7_e82748bd655f.slice/crio-ac7a25af40060a927c0191a9d7f6caf40327fe6b0db0e7d151eeba8f1682fd45 WatchSource:0}: Error finding container ac7a25af40060a927c0191a9d7f6caf40327fe6b0db0e7d151eeba8f1682fd45: 
Status 404 returned error can't find the container with id ac7a25af40060a927c0191a9d7f6caf40327fe6b0db0e7d151eeba8f1682fd45 Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.023780 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-666967b744-rtdmp"] Dec 05 08:44:36 crc kubenswrapper[4795]: W1205 08:44:36.051029 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1720beb5_14de_4db2_9581_945e2f781500.slice/crio-428e413ceb1b6f2f071e16ab6f46d2b02dc5241c355690824c040ed4c24a1634 WatchSource:0}: Error finding container 428e413ceb1b6f2f071e16ab6f46d2b02dc5241c355690824c040ed4c24a1634: Status 404 returned error can't find the container with id 428e413ceb1b6f2f071e16ab6f46d2b02dc5241c355690824c040ed4c24a1634 Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.605738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-666967b744-rtdmp" event={"ID":"1720beb5-14de-4db2-9581-945e2f781500","Type":"ContainerStarted","Data":"428e413ceb1b6f2f071e16ab6f46d2b02dc5241c355690824c040ed4c24a1634"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.609317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerStarted","Data":"be19f62cf7c60fe65931433e0a5734a5bcb27c66fffa49ad918909cee2adf63a"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.648898 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-57b485fdb4-h9cjs" podStartSLOduration=37.648870799 podStartE2EDuration="37.648870799s" podCreationTimestamp="2025-12-05 08:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:44:36.646596147 +0000 UTC m=+1228.219199886" watchObservedRunningTime="2025-12-05 08:44:36.648870799 +0000 UTC 
m=+1228.221474538" Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.658839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cg5dx" event={"ID":"b05e0563-19d6-439b-b9d2-d241537794c4","Type":"ContainerStarted","Data":"46fa3526494c87f64a30b69fcb7bba963258964ce3e90db0536aca8a8bd359f6"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.658910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cg5dx" event={"ID":"b05e0563-19d6-439b-b9d2-d241537794c4","Type":"ContainerStarted","Data":"1fb08649f023d8f9e1667764cfd621ae9d079c7242863eac73533f2a845119c5"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.682164 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6766d78d6c-l65vj" event={"ID":"8ada623e-3e62-480c-a681-19685e13dc82","Type":"ContainerStarted","Data":"30120b5e788c186377f59bbfe031926647f9f81cb22c52c06b7c9fff759a0059"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.682235 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6766d78d6c-l65vj" event={"ID":"8ada623e-3e62-480c-a681-19685e13dc82","Type":"ContainerStarted","Data":"030392e53d723a9e34025d6420943abd9f0e6db4456e96f280d4eeeae46f5a97"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.696366 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerStarted","Data":"ecee4fc18281693579f2445417cd59b08213910e3f12f77dc348f4cadec4c8ce"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.700983 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cg5dx" podStartSLOduration=28.700954476 podStartE2EDuration="28.700954476s" podCreationTimestamp="2025-12-05 08:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 08:44:36.682285325 +0000 UTC m=+1228.254889064" watchObservedRunningTime="2025-12-05 08:44:36.700954476 +0000 UTC m=+1228.273558225" Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.709978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e94043a9-32bf-4842-ad1f-583d7bb8b933","Type":"ContainerStarted","Data":"571da090e601e3d2e50598dad2aa0efd39a2de925f2b4138736154be418c654a"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.738036 4795 generic.go:334] "Generic (PLEG): container finished" podID="19ddeb6e-4006-4919-86f7-e82748bd655f" containerID="a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166" exitCode=0 Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.740090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" event={"ID":"19ddeb6e-4006-4919-86f7-e82748bd655f","Type":"ContainerDied","Data":"a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.740133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" event={"ID":"19ddeb6e-4006-4919-86f7-e82748bd655f","Type":"ContainerStarted","Data":"ac7a25af40060a927c0191a9d7f6caf40327fe6b0db0e7d151eeba8f1682fd45"} Dec 05 08:44:36 crc kubenswrapper[4795]: I1205 08:44:36.767939 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-797f5f5996-7wlp4" podStartSLOduration=7.028455043 podStartE2EDuration="37.767921343s" podCreationTimestamp="2025-12-05 08:43:59 +0000 UTC" firstStartedPulling="2025-12-05 08:44:04.018303513 +0000 UTC m=+1195.590907252" lastFinishedPulling="2025-12-05 08:44:34.757769813 +0000 UTC m=+1226.330373552" observedRunningTime="2025-12-05 08:44:36.722265137 +0000 UTC m=+1228.294868886" watchObservedRunningTime="2025-12-05 08:44:36.767921343 +0000 UTC m=+1228.340525082" Dec 05 
08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.800098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e94043a9-32bf-4842-ad1f-583d7bb8b933","Type":"ContainerStarted","Data":"e7c2f5e493d5e22571e238563f990d6c647a80e9da152a29cf8facde3f8373f5"} Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.838766 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" event={"ID":"19ddeb6e-4006-4919-86f7-e82748bd655f","Type":"ContainerStarted","Data":"5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9"} Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.839070 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.852911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-666967b744-rtdmp" event={"ID":"1720beb5-14de-4db2-9581-945e2f781500","Type":"ContainerStarted","Data":"12e4e7294c0ff8f1b5ed6cc6e1b6d86706e61eda5b1e625a3bcc1b4598aa4f83"} Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.852962 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-666967b744-rtdmp" event={"ID":"1720beb5-14de-4db2-9581-945e2f781500","Type":"ContainerStarted","Data":"f2185aa8a63a3927e1c97ff0f58b9a0575544b153ace41bfc8af687c1f1343e8"} Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.853179 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.871300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6766d78d6c-l65vj" event={"ID":"8ada623e-3e62-480c-a681-19685e13dc82","Type":"ContainerStarted","Data":"9007174b06db436ca6b440d700bba1640730699a6d2c9652759315036f5f2da2"} Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.872448 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.876998 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" podStartSLOduration=9.876978107 podStartE2EDuration="9.876978107s" podCreationTimestamp="2025-12-05 08:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:44:37.875445576 +0000 UTC m=+1229.448049315" watchObservedRunningTime="2025-12-05 08:44:37.876978107 +0000 UTC m=+1229.449581846" Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.897199 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010","Type":"ContainerStarted","Data":"d2c608376194ede01479d346c99e7e45d06925ba63944024e044cd8d0dc920d7"} Dec 05 08:44:37 crc kubenswrapper[4795]: I1205 08:44:37.923011 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-666967b744-rtdmp" podStartSLOduration=9.922966481 podStartE2EDuration="9.922966481s" podCreationTimestamp="2025-12-05 08:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:44:37.90205188 +0000 UTC m=+1229.474655619" watchObservedRunningTime="2025-12-05 08:44:37.922966481 +0000 UTC m=+1229.495570220" Dec 05 08:44:38 crc kubenswrapper[4795]: I1205 08:44:38.792864 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6766d78d6c-l65vj" podStartSLOduration=8.792836258 podStartE2EDuration="8.792836258s" podCreationTimestamp="2025-12-05 08:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:44:37.93896024 +0000 UTC 
m=+1229.511563979" watchObservedRunningTime="2025-12-05 08:44:38.792836258 +0000 UTC m=+1230.365439987" Dec 05 08:44:38 crc kubenswrapper[4795]: I1205 08:44:38.924752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e94043a9-32bf-4842-ad1f-583d7bb8b933","Type":"ContainerStarted","Data":"81a78d568bbd90cc1466533a14180c78a16082a4e4e8ba33b8591288f81dfc42"} Dec 05 08:44:38 crc kubenswrapper[4795]: I1205 08:44:38.947929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010","Type":"ContainerStarted","Data":"4b6bfe84272ec2a162ad7e434e9aaf9bb544e0f2c86fd535921bf973033ee045"} Dec 05 08:44:38 crc kubenswrapper[4795]: I1205 08:44:38.968469 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=38.968444659 podStartE2EDuration="38.968444659s" podCreationTimestamp="2025-12-05 08:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:44:38.959775276 +0000 UTC m=+1230.532379015" watchObservedRunningTime="2025-12-05 08:44:38.968444659 +0000 UTC m=+1230.541048398" Dec 05 08:44:39 crc kubenswrapper[4795]: I1205 08:44:39.004635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=39.004591829 podStartE2EDuration="39.004591829s" podCreationTimestamp="2025-12-05 08:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:44:38.99830562 +0000 UTC m=+1230.570909359" watchObservedRunningTime="2025-12-05 08:44:39.004591829 +0000 UTC m=+1230.577195568" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.034454 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.039082 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.361101 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.362444 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.656883 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.656941 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.713676 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.777155 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.829919 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.830010 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.884695 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.884754 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.923696 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.970188 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.988288 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.988885 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.988916 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 08:44:40 crc kubenswrapper[4795]: I1205 08:44:40.988933 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:42 crc kubenswrapper[4795]: I1205 08:44:42.000008 4795 generic.go:334] "Generic (PLEG): container finished" podID="d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" containerID="0da4fdb12c2d95e3f2dde1f024cce45891fc2bf5f879e6a7cc473244c713f96d" exitCode=0 Dec 05 08:44:42 crc kubenswrapper[4795]: I1205 08:44:42.000178 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nxkbh" 
event={"ID":"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87","Type":"ContainerDied","Data":"0da4fdb12c2d95e3f2dde1f024cce45891fc2bf5f879e6a7cc473244c713f96d"} Dec 05 08:44:43 crc kubenswrapper[4795]: I1205 08:44:43.782122 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:44:43 crc kubenswrapper[4795]: I1205 08:44:43.907121 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-jtjqv"] Dec 05 08:44:43 crc kubenswrapper[4795]: I1205 08:44:43.907496 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" podUID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerName="dnsmasq-dns" containerID="cri-o://ef7cc32e1710040e07241a8eae09249e3ff7a8b77c91d42ca6b651731e32f350" gracePeriod=10 Dec 05 08:44:44 crc kubenswrapper[4795]: I1205 08:44:44.032216 4795 generic.go:334] "Generic (PLEG): container finished" podID="b05e0563-19d6-439b-b9d2-d241537794c4" containerID="46fa3526494c87f64a30b69fcb7bba963258964ce3e90db0536aca8a8bd359f6" exitCode=0 Dec 05 08:44:44 crc kubenswrapper[4795]: I1205 08:44:44.032281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cg5dx" event={"ID":"b05e0563-19d6-439b-b9d2-d241537794c4","Type":"ContainerDied","Data":"46fa3526494c87f64a30b69fcb7bba963258964ce3e90db0536aca8a8bd359f6"} Dec 05 08:44:44 crc kubenswrapper[4795]: I1205 08:44:44.934378 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" podUID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Dec 05 08:44:45 crc kubenswrapper[4795]: I1205 08:44:45.044106 4795 generic.go:334] "Generic (PLEG): container finished" podID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerID="ef7cc32e1710040e07241a8eae09249e3ff7a8b77c91d42ca6b651731e32f350" exitCode=0 
Dec 05 08:44:45 crc kubenswrapper[4795]: I1205 08:44:45.044187 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" event={"ID":"83eff3d1-a4b3-4d67-9131-57df81514ccb","Type":"ContainerDied","Data":"ef7cc32e1710040e07241a8eae09249e3ff7a8b77c91d42ca6b651731e32f350"} Dec 05 08:44:46 crc kubenswrapper[4795]: I1205 08:44:46.647233 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:46 crc kubenswrapper[4795]: I1205 08:44:46.647676 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 08:44:46 crc kubenswrapper[4795]: I1205 08:44:46.652922 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 08:44:46 crc kubenswrapper[4795]: I1205 08:44:46.669946 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.087482 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.883467 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.894900 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-nxkbh" Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.984515 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-credential-keys\") pod \"b05e0563-19d6-439b-b9d2-d241537794c4\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.984941 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-fernet-keys\") pod \"b05e0563-19d6-439b-b9d2-d241537794c4\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.985110 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-scripts\") pod \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.985291 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-logs\") pod \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.985402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-scripts\") pod \"b05e0563-19d6-439b-b9d2-d241537794c4\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.985529 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-config-data\") pod \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.985695 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-config-data\") pod \"b05e0563-19d6-439b-b9d2-d241537794c4\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.985834 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-combined-ca-bundle\") pod \"b05e0563-19d6-439b-b9d2-d241537794c4\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.986003 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-combined-ca-bundle\") pod \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.986078 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvd7w\" (UniqueName: \"kubernetes.io/projected/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-kube-api-access-gvd7w\") pod \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\" (UID: \"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87\") " Dec 05 08:44:47 crc kubenswrapper[4795]: I1205 08:44:47.986154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddb28\" (UniqueName: \"kubernetes.io/projected/b05e0563-19d6-439b-b9d2-d241537794c4-kube-api-access-ddb28\") pod \"b05e0563-19d6-439b-b9d2-d241537794c4\" (UID: \"b05e0563-19d6-439b-b9d2-d241537794c4\") " Dec 05 08:44:48 crc 
kubenswrapper[4795]: I1205 08:44:48.002484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-scripts" (OuterVolumeSpecName: "scripts") pod "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" (UID: "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.014808 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-logs" (OuterVolumeSpecName: "logs") pod "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" (UID: "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.024806 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05e0563-19d6-439b-b9d2-d241537794c4-kube-api-access-ddb28" (OuterVolumeSpecName: "kube-api-access-ddb28") pod "b05e0563-19d6-439b-b9d2-d241537794c4" (UID: "b05e0563-19d6-439b-b9d2-d241537794c4"). InnerVolumeSpecName "kube-api-access-ddb28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.054121 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b05e0563-19d6-439b-b9d2-d241537794c4" (UID: "b05e0563-19d6-439b-b9d2-d241537794c4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.060324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-kube-api-access-gvd7w" (OuterVolumeSpecName: "kube-api-access-gvd7w") pod "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" (UID: "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87"). InnerVolumeSpecName "kube-api-access-gvd7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.079788 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-scripts" (OuterVolumeSpecName: "scripts") pod "b05e0563-19d6-439b-b9d2-d241537794c4" (UID: "b05e0563-19d6-439b-b9d2-d241537794c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.093115 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvd7w\" (UniqueName: \"kubernetes.io/projected/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-kube-api-access-gvd7w\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.093163 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddb28\" (UniqueName: \"kubernetes.io/projected/b05e0563-19d6-439b-b9d2-d241537794c4-kube-api-access-ddb28\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.093175 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.093185 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-scripts\") on node \"crc\" DevicePath 
\"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.093196 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.093204 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.112764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-nxkbh" event={"ID":"d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87","Type":"ContainerDied","Data":"2a40f844943afb15b416942f6ae9fad0e63fe182cead92a68aa515b73d6ca0b4"} Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.112816 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a40f844943afb15b416942f6ae9fad0e63fe182cead92a68aa515b73d6ca0b4" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.112963 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-nxkbh" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.114941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cg5dx" event={"ID":"b05e0563-19d6-439b-b9d2-d241537794c4","Type":"ContainerDied","Data":"1fb08649f023d8f9e1667764cfd621ae9d079c7242863eac73533f2a845119c5"} Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.114972 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fb08649f023d8f9e1667764cfd621ae9d079c7242863eac73533f2a845119c5" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.115030 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-cg5dx" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.152237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b05e0563-19d6-439b-b9d2-d241537794c4" (UID: "b05e0563-19d6-439b-b9d2-d241537794c4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.157259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" (UID: "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.163464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-config-data" (OuterVolumeSpecName: "config-data") pod "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" (UID: "d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.163662 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b05e0563-19d6-439b-b9d2-d241537794c4" (UID: "b05e0563-19d6-439b-b9d2-d241537794c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.196174 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.236197 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.236233 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.236525 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.202111 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-config-data" (OuterVolumeSpecName: "config-data") pod "b05e0563-19d6-439b-b9d2-d241537794c4" (UID: "b05e0563-19d6-439b-b9d2-d241537794c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.338216 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b05e0563-19d6-439b-b9d2-d241537794c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.457246 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.541171 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-nb\") pod \"83eff3d1-a4b3-4d67-9131-57df81514ccb\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.541380 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-svc\") pod \"83eff3d1-a4b3-4d67-9131-57df81514ccb\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.541426 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-sb\") pod \"83eff3d1-a4b3-4d67-9131-57df81514ccb\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.541449 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-swift-storage-0\") pod \"83eff3d1-a4b3-4d67-9131-57df81514ccb\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.541474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-config\") pod \"83eff3d1-a4b3-4d67-9131-57df81514ccb\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.541572 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rqrs\" 
(UniqueName: \"kubernetes.io/projected/83eff3d1-a4b3-4d67-9131-57df81514ccb-kube-api-access-5rqrs\") pod \"83eff3d1-a4b3-4d67-9131-57df81514ccb\" (UID: \"83eff3d1-a4b3-4d67-9131-57df81514ccb\") " Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.556546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83eff3d1-a4b3-4d67-9131-57df81514ccb-kube-api-access-5rqrs" (OuterVolumeSpecName: "kube-api-access-5rqrs") pod "83eff3d1-a4b3-4d67-9131-57df81514ccb" (UID: "83eff3d1-a4b3-4d67-9131-57df81514ccb"). InnerVolumeSpecName "kube-api-access-5rqrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.646077 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rqrs\" (UniqueName: \"kubernetes.io/projected/83eff3d1-a4b3-4d67-9131-57df81514ccb-kube-api-access-5rqrs\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.683563 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83eff3d1-a4b3-4d67-9131-57df81514ccb" (UID: "83eff3d1-a4b3-4d67-9131-57df81514ccb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.690483 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "83eff3d1-a4b3-4d67-9131-57df81514ccb" (UID: "83eff3d1-a4b3-4d67-9131-57df81514ccb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.703356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83eff3d1-a4b3-4d67-9131-57df81514ccb" (UID: "83eff3d1-a4b3-4d67-9131-57df81514ccb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.711383 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83eff3d1-a4b3-4d67-9131-57df81514ccb" (UID: "83eff3d1-a4b3-4d67-9131-57df81514ccb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.716163 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-config" (OuterVolumeSpecName: "config") pod "83eff3d1-a4b3-4d67-9131-57df81514ccb" (UID: "83eff3d1-a4b3-4d67-9131-57df81514ccb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.760524 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.761212 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.761249 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.761262 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:48 crc kubenswrapper[4795]: I1205 08:44:48.761272 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83eff3d1-a4b3-4d67-9131-57df81514ccb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.126882 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" event={"ID":"83eff3d1-a4b3-4d67-9131-57df81514ccb","Type":"ContainerDied","Data":"8a61dbbc97d61fa0a96e35176ab5f32d28f05f44bf1f540db36f93cf1ef089d8"} Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.126940 4795 scope.go:117] "RemoveContainer" containerID="ef7cc32e1710040e07241a8eae09249e3ff7a8b77c91d42ca6b651731e32f350" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.127090 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-jtjqv" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.194731 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-jtjqv"] Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.232705 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-jtjqv"] Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.274818 4795 scope.go:117] "RemoveContainer" containerID="b06ef4451be2a4a4ab957d3d2824334f89c2f97e2f81be14b44a905a65540d3b" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.287082 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-85d5d69654-vzspj"] Dec 05 08:44:49 crc kubenswrapper[4795]: E1205 08:44:49.287601 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerName="init" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.287641 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerName="init" Dec 05 08:44:49 crc kubenswrapper[4795]: E1205 08:44:49.287664 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" containerName="placement-db-sync" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.287671 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" containerName="placement-db-sync" Dec 05 08:44:49 crc kubenswrapper[4795]: E1205 08:44:49.287690 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerName="dnsmasq-dns" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.287697 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerName="dnsmasq-dns" Dec 05 08:44:49 crc kubenswrapper[4795]: E1205 08:44:49.287707 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b05e0563-19d6-439b-b9d2-d241537794c4" containerName="keystone-bootstrap" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.287717 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b05e0563-19d6-439b-b9d2-d241537794c4" containerName="keystone-bootstrap" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.287944 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="83eff3d1-a4b3-4d67-9131-57df81514ccb" containerName="dnsmasq-dns" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.287962 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" containerName="placement-db-sync" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.287974 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b05e0563-19d6-439b-b9d2-d241537794c4" containerName="keystone-bootstrap" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.288840 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.302132 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.302488 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.302693 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-c75xn" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.304850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.305044 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.306419 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.314539 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85d5d69654-vzspj"] Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.370959 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-56588789f4-7xbdx"] Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.377429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-credential-keys\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.377563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6hx\" (UniqueName: 
\"kubernetes.io/projected/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-kube-api-access-nz6hx\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.377588 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-config-data\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.377631 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-scripts\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.377671 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-internal-tls-certs\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.377693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-fernet-keys\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.377739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-combined-ca-bundle\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.377759 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-public-tls-certs\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.385902 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.394401 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rkxnk" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.394812 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.394954 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.395066 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.395217 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.427467 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56588789f4-7xbdx"] Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.503016 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-combined-ca-bundle\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.503094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-public-tls-certs\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.503152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40ecc6c1-814a-40dc-988b-d4b67a58794b-logs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.503195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-public-tls-certs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.503243 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-credential-keys\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.527278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-scripts\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.527461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-config-data\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.527553 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mgq5\" (UniqueName: \"kubernetes.io/projected/40ecc6c1-814a-40dc-988b-d4b67a58794b-kube-api-access-2mgq5\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.527635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6hx\" (UniqueName: \"kubernetes.io/projected/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-kube-api-access-nz6hx\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.527694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-config-data\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.534638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-scripts\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.534726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-internal-tls-certs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.534818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-internal-tls-certs\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.534884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-fernet-keys\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.535037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-combined-ca-bundle\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.547880 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-fernet-keys\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.601956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-internal-tls-certs\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.605470 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-scripts\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.605481 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-combined-ca-bundle\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.625139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6hx\" (UniqueName: \"kubernetes.io/projected/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-kube-api-access-nz6hx\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.630495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-public-tls-certs\") pod \"keystone-85d5d69654-vzspj\" (UID: 
\"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.653297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-credential-keys\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.664561 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0bb2937-2db6-41a5-b930-b1d479cd8a5f-config-data\") pod \"keystone-85d5d69654-vzspj\" (UID: \"f0bb2937-2db6-41a5-b930-b1d479cd8a5f\") " pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.664824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-scripts\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.664980 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-config-data\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.665075 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mgq5\" (UniqueName: \"kubernetes.io/projected/40ecc6c1-814a-40dc-988b-d4b67a58794b-kube-api-access-2mgq5\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc 
kubenswrapper[4795]: I1205 08:44:49.665179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-internal-tls-certs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.665373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-combined-ca-bundle\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.665443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40ecc6c1-814a-40dc-988b-d4b67a58794b-logs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.665477 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-public-tls-certs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.671413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40ecc6c1-814a-40dc-988b-d4b67a58794b-logs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.688042 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.711463 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-combined-ca-bundle\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.721714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-internal-tls-certs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.722436 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-config-data\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.728578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-public-tls-certs\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.730195 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40ecc6c1-814a-40dc-988b-d4b67a58794b-scripts\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.736391 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mgq5\" (UniqueName: \"kubernetes.io/projected/40ecc6c1-814a-40dc-988b-d4b67a58794b-kube-api-access-2mgq5\") pod \"placement-56588789f4-7xbdx\" (UID: \"40ecc6c1-814a-40dc-988b-d4b67a58794b\") " pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:49 crc kubenswrapper[4795]: I1205 08:44:49.803666 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:50 crc kubenswrapper[4795]: I1205 08:44:50.047563 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:44:50 crc kubenswrapper[4795]: I1205 08:44:50.165887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k28kg" event={"ID":"44d62cd1-585f-4756-b3f9-6f0714ea3248","Type":"ContainerStarted","Data":"3965dbcfed1102d4d5e89c40cecc086a0635b77949a3babca31ce6ffe5799f4a"} Dec 05 08:44:50 crc kubenswrapper[4795]: I1205 08:44:50.260264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1f0a01c-4d6c-4534-950a-699df43b935a","Type":"ContainerStarted","Data":"2f805fdee004ebc6b849916b9f24e479c474d5e914e39cc4e2265bb6e1100ff2"} Dec 05 08:44:50 crc kubenswrapper[4795]: I1205 08:44:50.427798 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:44:50 crc kubenswrapper[4795]: I1205 08:44:50.625424 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-k28kg" podStartSLOduration=13.731174262 podStartE2EDuration="1m2.625396896s" podCreationTimestamp="2025-12-05 08:43:48 +0000 UTC" firstStartedPulling="2025-12-05 08:43:51.226600062 +0000 UTC m=+1182.799203801" lastFinishedPulling="2025-12-05 08:44:40.120822706 +0000 UTC m=+1231.693426435" observedRunningTime="2025-12-05 08:44:50.19988445 +0000 UTC m=+1241.772488189" watchObservedRunningTime="2025-12-05 08:44:50.625396896 +0000 UTC m=+1242.198000635" Dec 05 08:44:50 crc kubenswrapper[4795]: I1205 08:44:50.634479 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56588789f4-7xbdx"] Dec 05 08:44:50 crc kubenswrapper[4795]: W1205 08:44:50.669731 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40ecc6c1_814a_40dc_988b_d4b67a58794b.slice/crio-59bf1308ad465b49ed5b809b058af5ab2158f707b41cf1af09062b44943a8326 WatchSource:0}: Error finding container 59bf1308ad465b49ed5b809b058af5ab2158f707b41cf1af09062b44943a8326: Status 404 returned error can't find the container with id 59bf1308ad465b49ed5b809b058af5ab2158f707b41cf1af09062b44943a8326 Dec 05 08:44:50 crc kubenswrapper[4795]: I1205 08:44:50.807845 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83eff3d1-a4b3-4d67-9131-57df81514ccb" path="/var/lib/kubelet/pods/83eff3d1-a4b3-4d67-9131-57df81514ccb/volumes" Dec 05 08:44:50 crc kubenswrapper[4795]: I1205 08:44:50.809746 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85d5d69654-vzspj"] Dec 05 08:44:51 crc kubenswrapper[4795]: I1205 08:44:51.282326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56588789f4-7xbdx" event={"ID":"40ecc6c1-814a-40dc-988b-d4b67a58794b","Type":"ContainerStarted","Data":"59bf1308ad465b49ed5b809b058af5ab2158f707b41cf1af09062b44943a8326"} Dec 05 08:44:51 
crc kubenswrapper[4795]: I1205 08:44:51.302530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85d5d69654-vzspj" event={"ID":"f0bb2937-2db6-41a5-b930-b1d479cd8a5f","Type":"ContainerStarted","Data":"363f5d6ef3982c3a8c15273ca2c12b016b6f20bbd724447c705d957ff6fd42cb"} Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.345774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cq4gw" event={"ID":"dd6ce9d5-263a-4b05-83e5-c349f0038001","Type":"ContainerStarted","Data":"48f628ae6feab2d2ac76c438de9dfdf7f23719be803014cdf3e16dcbb6e4ad36"} Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.368309 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56588789f4-7xbdx" event={"ID":"40ecc6c1-814a-40dc-988b-d4b67a58794b","Type":"ContainerStarted","Data":"c1f6508f6fc1e1a3109de5ce734a48dd9feb02cbbf7e09a1a1b6ba122a585514"} Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.368690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56588789f4-7xbdx" event={"ID":"40ecc6c1-814a-40dc-988b-d4b67a58794b","Type":"ContainerStarted","Data":"a5eeab0e79011d5a3610140121c654516222decd94c433591ee19a018bd2a477"} Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.368861 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.368959 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.385750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85d5d69654-vzspj" event={"ID":"f0bb2937-2db6-41a5-b930-b1d479cd8a5f","Type":"ContainerStarted","Data":"545de74471ef44996d707ea048b26381eb4fa5b661806ac2c6f0445769920d29"} Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.386341 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.395308 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-cq4gw" podStartSLOduration=5.340571207 podStartE2EDuration="1m4.395277239s" podCreationTimestamp="2025-12-05 08:43:48 +0000 UTC" firstStartedPulling="2025-12-05 08:43:50.713283681 +0000 UTC m=+1182.285887420" lastFinishedPulling="2025-12-05 08:44:49.767989713 +0000 UTC m=+1241.340593452" observedRunningTime="2025-12-05 08:44:52.368060269 +0000 UTC m=+1243.940664008" watchObservedRunningTime="2025-12-05 08:44:52.395277239 +0000 UTC m=+1243.967880978" Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.438982 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-56588789f4-7xbdx" podStartSLOduration=3.438951891 podStartE2EDuration="3.438951891s" podCreationTimestamp="2025-12-05 08:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:44:52.416333635 +0000 UTC m=+1243.988937384" watchObservedRunningTime="2025-12-05 08:44:52.438951891 +0000 UTC m=+1244.011555630" Dec 05 08:44:52 crc kubenswrapper[4795]: I1205 08:44:52.461738 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-85d5d69654-vzspj" podStartSLOduration=3.461706152 podStartE2EDuration="3.461706152s" podCreationTimestamp="2025-12-05 08:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:44:52.444242233 +0000 UTC m=+1244.016845972" watchObservedRunningTime="2025-12-05 08:44:52.461706152 +0000 UTC m=+1244.034309891" Dec 05 08:44:56 crc kubenswrapper[4795]: I1205 08:44:56.453376 4795 generic.go:334] "Generic (PLEG): container finished" podID="44d62cd1-585f-4756-b3f9-6f0714ea3248" 
containerID="3965dbcfed1102d4d5e89c40cecc086a0635b77949a3babca31ce6ffe5799f4a" exitCode=0 Dec 05 08:44:56 crc kubenswrapper[4795]: I1205 08:44:56.454065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k28kg" event={"ID":"44d62cd1-585f-4756-b3f9-6f0714ea3248","Type":"ContainerDied","Data":"3965dbcfed1102d4d5e89c40cecc086a0635b77949a3babca31ce6ffe5799f4a"} Dec 05 08:44:58 crc kubenswrapper[4795]: I1205 08:44:58.963184 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.033842 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.149288 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b"] Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.150783 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.154904 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.155339 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.217946 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b"] Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.300252 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/286e1852-50a3-4f67-9588-faf932b0d456-config-volume\") pod \"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.300358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74blx\" (UniqueName: \"kubernetes.io/projected/286e1852-50a3-4f67-9588-faf932b0d456-kube-api-access-74blx\") pod \"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.300447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/286e1852-50a3-4f67-9588-faf932b0d456-secret-volume\") pod \"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.359073 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.406298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/286e1852-50a3-4f67-9588-faf932b0d456-config-volume\") pod \"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.406439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74blx\" (UniqueName: \"kubernetes.io/projected/286e1852-50a3-4f67-9588-faf932b0d456-kube-api-access-74blx\") pod \"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.406593 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/286e1852-50a3-4f67-9588-faf932b0d456-secret-volume\") pod \"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.408286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/286e1852-50a3-4f67-9588-faf932b0d456-config-volume\") pod 
\"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.420989 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/286e1852-50a3-4f67-9588-faf932b0d456-secret-volume\") pod \"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.429316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74blx\" (UniqueName: \"kubernetes.io/projected/286e1852-50a3-4f67-9588-faf932b0d456-kube-api-access-74blx\") pod \"collect-profiles-29415405-dct2b\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:00 crc kubenswrapper[4795]: I1205 08:45:00.480128 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:01 crc kubenswrapper[4795]: I1205 08:45:01.202356 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6766d78d6c-l65vj" Dec 05 08:45:01 crc kubenswrapper[4795]: I1205 08:45:01.342134 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-666967b744-rtdmp"] Dec 05 08:45:01 crc kubenswrapper[4795]: I1205 08:45:01.342478 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-666967b744-rtdmp" podUID="1720beb5-14de-4db2-9581-945e2f781500" containerName="neutron-api" containerID="cri-o://f2185aa8a63a3927e1c97ff0f58b9a0575544b153ace41bfc8af687c1f1343e8" gracePeriod=30 Dec 05 08:45:01 crc kubenswrapper[4795]: I1205 08:45:01.343174 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-666967b744-rtdmp" podUID="1720beb5-14de-4db2-9581-945e2f781500" containerName="neutron-httpd" containerID="cri-o://12e4e7294c0ff8f1b5ed6cc6e1b6d86706e61eda5b1e625a3bcc1b4598aa4f83" gracePeriod=30 Dec 05 08:45:02 crc kubenswrapper[4795]: I1205 08:45:02.561509 4795 generic.go:334] "Generic (PLEG): container finished" podID="dd6ce9d5-263a-4b05-83e5-c349f0038001" containerID="48f628ae6feab2d2ac76c438de9dfdf7f23719be803014cdf3e16dcbb6e4ad36" exitCode=0 Dec 05 08:45:02 crc kubenswrapper[4795]: I1205 08:45:02.561880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cq4gw" event={"ID":"dd6ce9d5-263a-4b05-83e5-c349f0038001","Type":"ContainerDied","Data":"48f628ae6feab2d2ac76c438de9dfdf7f23719be803014cdf3e16dcbb6e4ad36"} Dec 05 08:45:03 crc kubenswrapper[4795]: I1205 08:45:03.581103 4795 generic.go:334] "Generic (PLEG): container finished" podID="1720beb5-14de-4db2-9581-945e2f781500" containerID="12e4e7294c0ff8f1b5ed6cc6e1b6d86706e61eda5b1e625a3bcc1b4598aa4f83" exitCode=0 Dec 05 08:45:03 crc 
kubenswrapper[4795]: I1205 08:45:03.581280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-666967b744-rtdmp" event={"ID":"1720beb5-14de-4db2-9581-945e2f781500","Type":"ContainerDied","Data":"12e4e7294c0ff8f1b5ed6cc6e1b6d86706e61eda5b1e625a3bcc1b4598aa4f83"} Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.518955 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-k28kg" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.522066 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.616091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-k28kg" event={"ID":"44d62cd1-585f-4756-b3f9-6f0714ea3248","Type":"ContainerDied","Data":"8004cfea5dd019a63db6cf532b45cfde2e4bbac38423edb21273d54b43005f95"} Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.616635 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8004cfea5dd019a63db6cf532b45cfde2e4bbac38423edb21273d54b43005f95" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.616172 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-k28kg" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.623554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-config-data\") pod \"dd6ce9d5-263a-4b05-83e5-c349f0038001\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.623664 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6lbn\" (UniqueName: \"kubernetes.io/projected/44d62cd1-585f-4756-b3f9-6f0714ea3248-kube-api-access-j6lbn\") pod \"44d62cd1-585f-4756-b3f9-6f0714ea3248\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.623716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-db-sync-config-data\") pod \"44d62cd1-585f-4756-b3f9-6f0714ea3248\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.623828 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-combined-ca-bundle\") pod \"44d62cd1-585f-4756-b3f9-6f0714ea3248\" (UID: \"44d62cd1-585f-4756-b3f9-6f0714ea3248\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.624027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-scripts\") pod \"dd6ce9d5-263a-4b05-83e5-c349f0038001\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.624086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-combined-ca-bundle\") pod \"dd6ce9d5-263a-4b05-83e5-c349f0038001\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.624108 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxztf\" (UniqueName: \"kubernetes.io/projected/dd6ce9d5-263a-4b05-83e5-c349f0038001-kube-api-access-lxztf\") pod \"dd6ce9d5-263a-4b05-83e5-c349f0038001\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.624140 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd6ce9d5-263a-4b05-83e5-c349f0038001-etc-machine-id\") pod \"dd6ce9d5-263a-4b05-83e5-c349f0038001\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.624166 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-db-sync-config-data\") pod \"dd6ce9d5-263a-4b05-83e5-c349f0038001\" (UID: \"dd6ce9d5-263a-4b05-83e5-c349f0038001\") " Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.625500 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd6ce9d5-263a-4b05-83e5-c349f0038001-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dd6ce9d5-263a-4b05-83e5-c349f0038001" (UID: "dd6ce9d5-263a-4b05-83e5-c349f0038001"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.644227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-cq4gw" event={"ID":"dd6ce9d5-263a-4b05-83e5-c349f0038001","Type":"ContainerDied","Data":"179a1297321689ba17b297789e4615ffb4fa777af5a4d0361e911112e839038a"} Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.646887 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="179a1297321689ba17b297789e4615ffb4fa777af5a4d0361e911112e839038a" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.644568 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-cq4gw" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.663957 4795 generic.go:334] "Generic (PLEG): container finished" podID="1720beb5-14de-4db2-9581-945e2f781500" containerID="f2185aa8a63a3927e1c97ff0f58b9a0575544b153ace41bfc8af687c1f1343e8" exitCode=0 Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.664028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-666967b744-rtdmp" event={"ID":"1720beb5-14de-4db2-9581-945e2f781500","Type":"ContainerDied","Data":"f2185aa8a63a3927e1c97ff0f58b9a0575544b153ace41bfc8af687c1f1343e8"} Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.664547 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "44d62cd1-585f-4756-b3f9-6f0714ea3248" (UID: "44d62cd1-585f-4756-b3f9-6f0714ea3248"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.664837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dd6ce9d5-263a-4b05-83e5-c349f0038001" (UID: "dd6ce9d5-263a-4b05-83e5-c349f0038001"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.673142 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6ce9d5-263a-4b05-83e5-c349f0038001-kube-api-access-lxztf" (OuterVolumeSpecName: "kube-api-access-lxztf") pod "dd6ce9d5-263a-4b05-83e5-c349f0038001" (UID: "dd6ce9d5-263a-4b05-83e5-c349f0038001"). InnerVolumeSpecName "kube-api-access-lxztf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.687870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd6ce9d5-263a-4b05-83e5-c349f0038001" (UID: "dd6ce9d5-263a-4b05-83e5-c349f0038001"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.695190 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44d62cd1-585f-4756-b3f9-6f0714ea3248-kube-api-access-j6lbn" (OuterVolumeSpecName: "kube-api-access-j6lbn") pod "44d62cd1-585f-4756-b3f9-6f0714ea3248" (UID: "44d62cd1-585f-4756-b3f9-6f0714ea3248"). InnerVolumeSpecName "kube-api-access-j6lbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.712846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-scripts" (OuterVolumeSpecName: "scripts") pod "dd6ce9d5-263a-4b05-83e5-c349f0038001" (UID: "dd6ce9d5-263a-4b05-83e5-c349f0038001"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.728423 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.728461 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.728473 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxztf\" (UniqueName: \"kubernetes.io/projected/dd6ce9d5-263a-4b05-83e5-c349f0038001-kube-api-access-lxztf\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.728488 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dd6ce9d5-263a-4b05-83e5-c349f0038001-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.728498 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.728507 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6lbn\" (UniqueName: 
\"kubernetes.io/projected/44d62cd1-585f-4756-b3f9-6f0714ea3248-kube-api-access-j6lbn\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.728516 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.767828 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44d62cd1-585f-4756-b3f9-6f0714ea3248" (UID: "44d62cd1-585f-4756-b3f9-6f0714ea3248"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.837698 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44d62cd1-585f-4756-b3f9-6f0714ea3248-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.861741 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-config-data" (OuterVolumeSpecName: "config-data") pod "dd6ce9d5-263a-4b05-83e5-c349f0038001" (UID: "dd6ce9d5-263a-4b05-83e5-c349f0038001"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:04 crc kubenswrapper[4795]: I1205 08:45:04.962325 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd6ce9d5-263a-4b05-83e5-c349f0038001-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.071029 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:05 crc kubenswrapper[4795]: E1205 08:45:05.072265 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6ce9d5-263a-4b05-83e5-c349f0038001" containerName="cinder-db-sync" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.080569 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6ce9d5-263a-4b05-83e5-c349f0038001" containerName="cinder-db-sync" Dec 05 08:45:05 crc kubenswrapper[4795]: E1205 08:45:05.081135 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44d62cd1-585f-4756-b3f9-6f0714ea3248" containerName="barbican-db-sync" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.088344 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="44d62cd1-585f-4756-b3f9-6f0714ea3248" containerName="barbican-db-sync" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.090257 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6ce9d5-263a-4b05-83e5-c349f0038001" containerName="cinder-db-sync" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.090354 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="44d62cd1-585f-4756-b3f9-6f0714ea3248" containerName="barbican-db-sync" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.105286 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.114364 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hrknz" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.115774 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.115932 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.116110 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.168601 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.173155 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.173387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.173440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xs25\" (UniqueName: \"kubernetes.io/projected/e185529c-893e-438d-9833-97f2aa0275d1-kube-api-access-4xs25\") pod \"cinder-scheduler-0\" (UID: 
\"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.173540 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-scripts\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.173577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.173623 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e185529c-893e-438d-9833-97f2aa0275d1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.191656 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-774db89647-5qk42"] Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.194227 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.208784 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774db89647-5qk42"] Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.261683 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.270079 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.273556 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288204 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288490 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-scripts\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288649 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-config\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e185529c-893e-438d-9833-97f2aa0275d1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98k7\" (UniqueName: \"kubernetes.io/projected/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-kube-api-access-x98k7\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288860 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288915 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e185529c-893e-438d-9833-97f2aa0275d1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.288970 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.289414 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xs25\" (UniqueName: \"kubernetes.io/projected/e185529c-893e-438d-9833-97f2aa0275d1-kube-api-access-4xs25\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.289457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-svc\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.289488 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.297546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.299305 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.312076 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-scripts\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.326997 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.328357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.335272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xs25\" (UniqueName: \"kubernetes.io/projected/e185529c-893e-438d-9833-97f2aa0275d1-kube-api-access-4xs25\") pod \"cinder-scheduler-0\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-svc\") pod 
\"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxtl7\" (UniqueName: \"kubernetes.io/projected/312bcea2-6846-48cc-a766-f047b377b2ec-kube-api-access-gxtl7\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397424 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-scripts\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-config\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397522 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98k7\" (UniqueName: \"kubernetes.io/projected/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-kube-api-access-x98k7\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397555 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312bcea2-6846-48cc-a766-f047b377b2ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 
08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.397642 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/312bcea2-6846-48cc-a766-f047b377b2ec-logs\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.398957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-svc\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.399506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-sb\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.400266 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-nb\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.401333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-config\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.404656 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-swift-storage-0\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.456295 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.461362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98k7\" (UniqueName: \"kubernetes.io/projected/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-kube-api-access-x98k7\") pod \"dnsmasq-dns-774db89647-5qk42\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.503011 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.503128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312bcea2-6846-48cc-a766-f047b377b2ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.503153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.503185 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/312bcea2-6846-48cc-a766-f047b377b2ec-logs\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.503216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.503250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxtl7\" (UniqueName: \"kubernetes.io/projected/312bcea2-6846-48cc-a766-f047b377b2ec-kube-api-access-gxtl7\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.503309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-scripts\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.504222 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/312bcea2-6846-48cc-a766-f047b377b2ec-logs\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.504250 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312bcea2-6846-48cc-a766-f047b377b2ec-etc-machine-id\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 
05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.518354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data-custom\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.520748 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.522373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.528460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-scripts\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.540580 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.547638 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxtl7\" (UniqueName: \"kubernetes.io/projected/312bcea2-6846-48cc-a766-f047b377b2ec-kube-api-access-gxtl7\") pod \"cinder-api-0\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " pod="openstack/cinder-api-0" Dec 05 08:45:05 crc kubenswrapper[4795]: I1205 08:45:05.617418 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.062710 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d6d5498f5-mkdbf"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.064672 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.077208 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.077450 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.077571 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-czt5p" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.081695 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6d6c47b668-nqmgd"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.083359 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.115243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.116305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c651edcf-b0db-4e86-9c04-6b26df481c95-logs\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.116378 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-combined-ca-bundle\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.116456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-config-data-custom\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.117645 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-config-data\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.117698 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x9vv\" (UniqueName: \"kubernetes.io/projected/c651edcf-b0db-4e86-9c04-6b26df481c95-kube-api-access-8x9vv\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.119013 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6d6c47b668-nqmgd"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.153901 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d6d5498f5-mkdbf"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-config-data\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227669 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x9vv\" (UniqueName: \"kubernetes.io/projected/c651edcf-b0db-4e86-9c04-6b26df481c95-kube-api-access-8x9vv\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-config-data-custom\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 
08:45:06.227767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c651edcf-b0db-4e86-9c04-6b26df481c95-logs\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-config-data\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227831 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-combined-ca-bundle\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-combined-ca-bundle\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-config-data-custom\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " 
pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-logs\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.227981 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb4rc\" (UniqueName: \"kubernetes.io/projected/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-kube-api-access-fb4rc\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.229495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c651edcf-b0db-4e86-9c04-6b26df481c95-logs\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.240372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-config-data\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.254429 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-config-data-custom\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: 
\"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.263302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x9vv\" (UniqueName: \"kubernetes.io/projected/c651edcf-b0db-4e86-9c04-6b26df481c95-kube-api-access-8x9vv\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.267514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c651edcf-b0db-4e86-9c04-6b26df481c95-combined-ca-bundle\") pod \"barbican-worker-6d6d5498f5-mkdbf\" (UID: \"c651edcf-b0db-4e86-9c04-6b26df481c95\") " pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.312845 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-5qk42"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.329430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb4rc\" (UniqueName: \"kubernetes.io/projected/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-kube-api-access-fb4rc\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.329540 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-config-data-custom\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.329575 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-config-data\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.329636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-combined-ca-bundle\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.329694 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-logs\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.330092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-logs\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.345252 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-config-data-custom\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc 
kubenswrapper[4795]: I1205 08:45:06.345803 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-combined-ca-bundle\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.353415 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-config-data\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.354707 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6977767f64-7wgr9"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.356884 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.367129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.374340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6977767f64-7wgr9"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.393775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb4rc\" (UniqueName: \"kubernetes.io/projected/726dc98e-9fe1-4b31-ba77-29d9e165b6d6-kube-api-access-fb4rc\") pod \"barbican-keystone-listener-6d6c47b668-nqmgd\" (UID: \"726dc98e-9fe1-4b31-ba77-29d9e165b6d6\") " pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.399245 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d6d5498f5-mkdbf" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.415124 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-k8zsh"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.417849 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.431356 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.438115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data-custom\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.438287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-logs\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.438342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.438426 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-combined-ca-bundle\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.438476 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwprm\" (UniqueName: \"kubernetes.io/projected/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-kube-api-access-fwprm\") pod 
\"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.482997 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-k8zsh"] Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.540962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data-custom\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.541361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.541405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.541432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-logs\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.541470 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.541526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.541570 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4rdf\" (UniqueName: \"kubernetes.io/projected/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-kube-api-access-z4rdf\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.547551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-logs\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.547820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-combined-ca-bundle\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.547952 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-config\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.547996 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwprm\" (UniqueName: \"kubernetes.io/projected/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-kube-api-access-fwprm\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.548068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.566290 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-combined-ca-bundle\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.566306 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data-custom\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.580144 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fwprm\" (UniqueName: \"kubernetes.io/projected/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-kube-api-access-fwprm\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.582280 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data\") pod \"barbican-api-6977767f64-7wgr9\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.650484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4rdf\" (UniqueName: \"kubernetes.io/projected/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-kube-api-access-z4rdf\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.650587 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-config\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.650672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.650725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.650747 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.650770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.653339 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.653705 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.654206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-config\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.654425 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-svc\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.654928 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.671801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4rdf\" (UniqueName: \"kubernetes.io/projected/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-kube-api-access-z4rdf\") pod \"dnsmasq-dns-6578955fd5-k8zsh\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.730040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:06 crc kubenswrapper[4795]: I1205 08:45:06.789514 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:07 crc kubenswrapper[4795]: E1205 08:45:07.041426 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Dec 05 08:45:07 crc kubenswrapper[4795]: E1205 08:45:07.041713 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wcj8r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b1f0a01c-4d6c-4534-950a-699df43b935a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 05 08:45:07 crc kubenswrapper[4795]: E1205 08:45:07.046603 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.096679 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.171425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-httpd-config\") pod \"1720beb5-14de-4db2-9581-945e2f781500\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.172426 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-ovndb-tls-certs\") pod \"1720beb5-14de-4db2-9581-945e2f781500\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.172476 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-config\") pod \"1720beb5-14de-4db2-9581-945e2f781500\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.172535 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkr2p\" (UniqueName: \"kubernetes.io/projected/1720beb5-14de-4db2-9581-945e2f781500-kube-api-access-jkr2p\") pod \"1720beb5-14de-4db2-9581-945e2f781500\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.172814 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-combined-ca-bundle\") pod \"1720beb5-14de-4db2-9581-945e2f781500\" (UID: \"1720beb5-14de-4db2-9581-945e2f781500\") " Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.201234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1720beb5-14de-4db2-9581-945e2f781500-kube-api-access-jkr2p" (OuterVolumeSpecName: "kube-api-access-jkr2p") pod "1720beb5-14de-4db2-9581-945e2f781500" (UID: "1720beb5-14de-4db2-9581-945e2f781500"). InnerVolumeSpecName "kube-api-access-jkr2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.206891 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1720beb5-14de-4db2-9581-945e2f781500" (UID: "1720beb5-14de-4db2-9581-945e2f781500"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.275427 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-config" (OuterVolumeSpecName: "config") pod "1720beb5-14de-4db2-9581-945e2f781500" (UID: "1720beb5-14de-4db2-9581-945e2f781500"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.277247 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.277278 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.277292 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkr2p\" (UniqueName: \"kubernetes.io/projected/1720beb5-14de-4db2-9581-945e2f781500-kube-api-access-jkr2p\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.320138 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1720beb5-14de-4db2-9581-945e2f781500" (UID: "1720beb5-14de-4db2-9581-945e2f781500"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.373707 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1720beb5-14de-4db2-9581-945e2f781500" (UID: "1720beb5-14de-4db2-9581-945e2f781500"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.384829 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.384868 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1720beb5-14de-4db2-9581-945e2f781500-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.793365 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b"] Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.797147 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerName="ceilometer-notification-agent" containerID="cri-o://a15485c5ca1c12de33abea552ef3854d359f6cf021bfbc8c4f438f477a74f382" gracePeriod=30 Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.797310 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-666967b744-rtdmp" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.799364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-666967b744-rtdmp" event={"ID":"1720beb5-14de-4db2-9581-945e2f781500","Type":"ContainerDied","Data":"428e413ceb1b6f2f071e16ab6f46d2b02dc5241c355690824c040ed4c24a1634"} Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.799422 4795 scope.go:117] "RemoveContainer" containerID="12e4e7294c0ff8f1b5ed6cc6e1b6d86706e61eda5b1e625a3bcc1b4598aa4f83" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.799733 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerName="sg-core" containerID="cri-o://2f805fdee004ebc6b849916b9f24e479c474d5e914e39cc4e2265bb6e1100ff2" gracePeriod=30 Dec 05 08:45:07 crc kubenswrapper[4795]: W1205 08:45:07.883725 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod286e1852_50a3_4f67_9588_faf932b0d456.slice/crio-336eba3287d1edaed15cb45e292e457703ecaa270587fd72572e5ac84b4d68ab WatchSource:0}: Error finding container 336eba3287d1edaed15cb45e292e457703ecaa270587fd72572e5ac84b4d68ab: Status 404 returned error can't find the container with id 336eba3287d1edaed15cb45e292e457703ecaa270587fd72572e5ac84b4d68ab Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.941044 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-666967b744-rtdmp"] Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.941138 4795 scope.go:117] "RemoveContainer" containerID="f2185aa8a63a3927e1c97ff0f58b9a0575544b153ace41bfc8af687c1f1343e8" Dec 05 08:45:07 crc kubenswrapper[4795]: I1205 08:45:07.953569 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-666967b744-rtdmp"] Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.252345 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-5qk42"] Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.283557 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6977767f64-7wgr9"] Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.411734 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d6d5498f5-mkdbf"] Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.538241 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.576357 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.610369 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6d6c47b668-nqmgd"] Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.623059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-k8zsh"] Dec 05 08:45:08 crc kubenswrapper[4795]: W1205 08:45:08.667854 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc69c8e1_899b_4dc2_a28e_7aeab9c9f3b2.slice/crio-66ab8089d0897d982673c4da63a676e0455d43a90ebd4bd23ab939514472da33 WatchSource:0}: Error finding container 66ab8089d0897d982673c4da63a676e0455d43a90ebd4bd23ab939514472da33: Status 404 returned error can't find the container with id 66ab8089d0897d982673c4da63a676e0455d43a90ebd4bd23ab939514472da33 Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.833771 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1720beb5-14de-4db2-9581-945e2f781500" path="/var/lib/kubelet/pods/1720beb5-14de-4db2-9581-945e2f781500/volumes" Dec 05 08:45:08 crc kubenswrapper[4795]: I1205 08:45:08.940444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"312bcea2-6846-48cc-a766-f047b377b2ec","Type":"ContainerStarted","Data":"1eec65489d3d55b51464fcb50472f451111a613c717efe229bdebcda355cc276"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.001536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" event={"ID":"286e1852-50a3-4f67-9588-faf932b0d456","Type":"ContainerStarted","Data":"9fe1328145a5c8747c67de153cf214e883013a3aea00b96b0f3c8402dff90270"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.001601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" event={"ID":"286e1852-50a3-4f67-9588-faf932b0d456","Type":"ContainerStarted","Data":"336eba3287d1edaed15cb45e292e457703ecaa270587fd72572e5ac84b4d68ab"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.053225 4795 generic.go:334] "Generic (PLEG): container finished" podID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerID="2f805fdee004ebc6b849916b9f24e479c474d5e914e39cc4e2265bb6e1100ff2" exitCode=2 Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.053352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1f0a01c-4d6c-4534-950a-699df43b935a","Type":"ContainerDied","Data":"2f805fdee004ebc6b849916b9f24e479c474d5e914e39cc4e2265bb6e1100ff2"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.102959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977767f64-7wgr9" event={"ID":"0f621e5b-0030-4a7d-9985-b65eafd8f1f7","Type":"ContainerStarted","Data":"0b9530a45cd984ad5a229f65ddf79e8b412ce3002e730800f4091a2fee40eae6"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.154953 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d6d5498f5-mkdbf" 
event={"ID":"c651edcf-b0db-4e86-9c04-6b26df481c95","Type":"ContainerStarted","Data":"b9f6a6e386aa430018db5fba1b1086e1f03c720cdb310a586e3ed510ef1588ca"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.201478 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-5qk42" event={"ID":"1347f699-2df1-4e7a-8c79-ddff51fe3d7e","Type":"ContainerStarted","Data":"ac77811e0f8ae7aedb782ac8f0af0657b37f12150c5463e601f387f101ddf824"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.240483 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" event={"ID":"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2","Type":"ContainerStarted","Data":"66ab8089d0897d982673c4da63a676e0455d43a90ebd4bd23ab939514472da33"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.275198 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" event={"ID":"726dc98e-9fe1-4b31-ba77-29d9e165b6d6","Type":"ContainerStarted","Data":"4bfbcb39ed8fe0152d6cede427900645238f81709b56d7dc258f0f0395bfd5a7"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.312828 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e185529c-893e-438d-9833-97f2aa0275d1","Type":"ContainerStarted","Data":"1be407f7c37d48aa6608085bfa6427170d4703ae936465056b2ac234f4351ec2"} Dec 05 08:45:09 crc kubenswrapper[4795]: I1205 08:45:09.324038 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" podStartSLOduration=9.32401224 podStartE2EDuration="9.32401224s" podCreationTimestamp="2025-12-05 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:45:09.260481545 +0000 UTC m=+1260.833085284" watchObservedRunningTime="2025-12-05 08:45:09.32401224 +0000 UTC 
m=+1260.896615979" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.033044 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.033518 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.034784 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"ecee4fc18281693579f2445417cd59b08213910e3f12f77dc348f4cadec4c8ce"} pod="openstack/horizon-797f5f5996-7wlp4" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.034827 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" containerID="cri-o://ecee4fc18281693579f2445417cd59b08213910e3f12f77dc348f4cadec4c8ce" gracePeriod=30 Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.328454 4795 generic.go:334] "Generic (PLEG): container finished" podID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" containerID="b5a017bb4279a9b4099caf917181a06cc26e7799a700a0b689eebfb01b8b0c8d" exitCode=0 Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.328562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" event={"ID":"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2","Type":"ContainerDied","Data":"b5a017bb4279a9b4099caf917181a06cc26e7799a700a0b689eebfb01b8b0c8d"} Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.338965 4795 generic.go:334] "Generic (PLEG): container 
finished" podID="286e1852-50a3-4f67-9588-faf932b0d456" containerID="9fe1328145a5c8747c67de153cf214e883013a3aea00b96b0f3c8402dff90270" exitCode=0 Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.339045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" event={"ID":"286e1852-50a3-4f67-9588-faf932b0d456","Type":"ContainerDied","Data":"9fe1328145a5c8747c67de153cf214e883013a3aea00b96b0f3c8402dff90270"} Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.346902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977767f64-7wgr9" event={"ID":"0f621e5b-0030-4a7d-9985-b65eafd8f1f7","Type":"ContainerStarted","Data":"6e060856ae3aa62bca8f3d8877b57437e11419a8776013e24b00b7480d724c70"} Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.346994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977767f64-7wgr9" event={"ID":"0f621e5b-0030-4a7d-9985-b65eafd8f1f7","Type":"ContainerStarted","Data":"404edce1ce538e5539ed675e02b355684baa7de19abf1b878fcf9b1b91429017"} Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.348181 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.348218 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.364379 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.364450 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="1347f699-2df1-4e7a-8c79-ddff51fe3d7e" containerID="053fc929aa37bcb1314ce6c65dc7bb2bdb35680cf8708d3dd1896cb22f6492fd" exitCode=0 Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.364745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-5qk42" event={"ID":"1347f699-2df1-4e7a-8c79-ddff51fe3d7e","Type":"ContainerDied","Data":"053fc929aa37bcb1314ce6c65dc7bb2bdb35680cf8708d3dd1896cb22f6492fd"} Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.364784 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.366264 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"be19f62cf7c60fe65931433e0a5734a5bcb27c66fffa49ad918909cee2adf63a"} pod="openstack/horizon-57b485fdb4-h9cjs" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.366320 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" containerID="cri-o://be19f62cf7c60fe65931433e0a5734a5bcb27c66fffa49ad918909cee2adf63a" gracePeriod=30 Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.374066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"312bcea2-6846-48cc-a766-f047b377b2ec","Type":"ContainerStarted","Data":"02b4ccc417d11aae6af05b30e8e9a844cc62008701443872bb23a94423afd993"} Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.392731 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.462680 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6977767f64-7wgr9" 
podStartSLOduration=4.462640808 podStartE2EDuration="4.462640808s" podCreationTimestamp="2025-12-05 08:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:45:10.403735658 +0000 UTC m=+1261.976339387" watchObservedRunningTime="2025-12-05 08:45:10.462640808 +0000 UTC m=+1262.035244567" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.835381 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.835881 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.835948 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.836861 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c93ddbd048ff8d41779ab69c4d06b72c0bf8343289b56925c9b595ac0b0536d9"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 08:45:10 crc kubenswrapper[4795]: I1205 08:45:10.836916 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://c93ddbd048ff8d41779ab69c4d06b72c0bf8343289b56925c9b595ac0b0536d9" gracePeriod=600 Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.390078 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e185529c-893e-438d-9833-97f2aa0275d1","Type":"ContainerStarted","Data":"66133fc296ecc3cfb5cecdb5ae91f4c254ec1bc1d485a6da20174978a722a0c9"} Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.392759 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="c93ddbd048ff8d41779ab69c4d06b72c0bf8343289b56925c9b595ac0b0536d9" exitCode=0 Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.392820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"c93ddbd048ff8d41779ab69c4d06b72c0bf8343289b56925c9b595ac0b0536d9"} Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.392854 4795 scope.go:117] "RemoveContainer" containerID="c2f6080d55cccfdc27d13b3507aa7946f9ae66b27d3649b388782040496135b5" Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.396515 4795 generic.go:334] "Generic (PLEG): container finished" podID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerID="a15485c5ca1c12de33abea552ef3854d359f6cf021bfbc8c4f438f477a74f382" exitCode=0 Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.399647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1f0a01c-4d6c-4534-950a-699df43b935a","Type":"ContainerDied","Data":"a15485c5ca1c12de33abea552ef3854d359f6cf021bfbc8c4f438f477a74f382"} Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.703323 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.786767 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-svc\") pod \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.790781 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x98k7\" (UniqueName: \"kubernetes.io/projected/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-kube-api-access-x98k7\") pod \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.790905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-swift-storage-0\") pod \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.791028 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-sb\") pod \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.791266 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-nb\") pod \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.791347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-config\") pod \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\" (UID: \"1347f699-2df1-4e7a-8c79-ddff51fe3d7e\") " Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.844909 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-kube-api-access-x98k7" (OuterVolumeSpecName: "kube-api-access-x98k7") pod "1347f699-2df1-4e7a-8c79-ddff51fe3d7e" (UID: "1347f699-2df1-4e7a-8c79-ddff51fe3d7e"). InnerVolumeSpecName "kube-api-access-x98k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.900025 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x98k7\" (UniqueName: \"kubernetes.io/projected/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-kube-api-access-x98k7\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.940321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-config" (OuterVolumeSpecName: "config") pod "1347f699-2df1-4e7a-8c79-ddff51fe3d7e" (UID: "1347f699-2df1-4e7a-8c79-ddff51fe3d7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.956366 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1347f699-2df1-4e7a-8c79-ddff51fe3d7e" (UID: "1347f699-2df1-4e7a-8c79-ddff51fe3d7e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.991019 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1347f699-2df1-4e7a-8c79-ddff51fe3d7e" (UID: "1347f699-2df1-4e7a-8c79-ddff51fe3d7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.993543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1347f699-2df1-4e7a-8c79-ddff51fe3d7e" (UID: "1347f699-2df1-4e7a-8c79-ddff51fe3d7e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:11 crc kubenswrapper[4795]: I1205 08:45:11.995140 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1347f699-2df1-4e7a-8c79-ddff51fe3d7e" (UID: "1347f699-2df1-4e7a-8c79-ddff51fe3d7e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.002464 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.002517 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.002676 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.002691 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.002702 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1347f699-2df1-4e7a-8c79-ddff51fe3d7e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.412697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774db89647-5qk42" event={"ID":"1347f699-2df1-4e7a-8c79-ddff51fe3d7e","Type":"ContainerDied","Data":"ac77811e0f8ae7aedb782ac8f0af0657b37f12150c5463e601f387f101ddf824"} Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.412844 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774db89647-5qk42" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.569741 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774db89647-5qk42"] Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.589986 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-774db89647-5qk42"] Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.628515 4795 scope.go:117] "RemoveContainer" containerID="053fc929aa37bcb1314ce6c65dc7bb2bdb35680cf8708d3dd1896cb22f6492fd" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.780974 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.785756 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1347f699-2df1-4e7a-8c79-ddff51fe3d7e" path="/var/lib/kubelet/pods/1347f699-2df1-4e7a-8c79-ddff51fe3d7e/volumes" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.796601 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.832877 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcj8r\" (UniqueName: \"kubernetes.io/projected/b1f0a01c-4d6c-4534-950a-699df43b935a-kube-api-access-wcj8r\") pod \"b1f0a01c-4d6c-4534-950a-699df43b935a\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.832973 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-sg-core-conf-yaml\") pod \"b1f0a01c-4d6c-4534-950a-699df43b935a\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.833023 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74blx\" (UniqueName: \"kubernetes.io/projected/286e1852-50a3-4f67-9588-faf932b0d456-kube-api-access-74blx\") pod \"286e1852-50a3-4f67-9588-faf932b0d456\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.833082 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-combined-ca-bundle\") pod \"b1f0a01c-4d6c-4534-950a-699df43b935a\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.833133 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-run-httpd\") pod \"b1f0a01c-4d6c-4534-950a-699df43b935a\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.833207 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/286e1852-50a3-4f67-9588-faf932b0d456-secret-volume\") pod \"286e1852-50a3-4f67-9588-faf932b0d456\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.833238 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-scripts\") pod \"b1f0a01c-4d6c-4534-950a-699df43b935a\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.833323 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-log-httpd\") pod \"b1f0a01c-4d6c-4534-950a-699df43b935a\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.833369 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-config-data\") pod \"b1f0a01c-4d6c-4534-950a-699df43b935a\" (UID: \"b1f0a01c-4d6c-4534-950a-699df43b935a\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.833509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/286e1852-50a3-4f67-9588-faf932b0d456-config-volume\") pod \"286e1852-50a3-4f67-9588-faf932b0d456\" (UID: \"286e1852-50a3-4f67-9588-faf932b0d456\") " Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.854977 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b1f0a01c-4d6c-4534-950a-699df43b935a" (UID: "b1f0a01c-4d6c-4534-950a-699df43b935a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.867951 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/286e1852-50a3-4f67-9588-faf932b0d456-config-volume" (OuterVolumeSpecName: "config-volume") pod "286e1852-50a3-4f67-9588-faf932b0d456" (UID: "286e1852-50a3-4f67-9588-faf932b0d456"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.870413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f0a01c-4d6c-4534-950a-699df43b935a-kube-api-access-wcj8r" (OuterVolumeSpecName: "kube-api-access-wcj8r") pod "b1f0a01c-4d6c-4534-950a-699df43b935a" (UID: "b1f0a01c-4d6c-4534-950a-699df43b935a"). InnerVolumeSpecName "kube-api-access-wcj8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.876236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b1f0a01c-4d6c-4534-950a-699df43b935a" (UID: "b1f0a01c-4d6c-4534-950a-699df43b935a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.880781 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-scripts" (OuterVolumeSpecName: "scripts") pod "b1f0a01c-4d6c-4534-950a-699df43b935a" (UID: "b1f0a01c-4d6c-4534-950a-699df43b935a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.909952 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/286e1852-50a3-4f67-9588-faf932b0d456-kube-api-access-74blx" (OuterVolumeSpecName: "kube-api-access-74blx") pod "286e1852-50a3-4f67-9588-faf932b0d456" (UID: "286e1852-50a3-4f67-9588-faf932b0d456"). InnerVolumeSpecName "kube-api-access-74blx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.917397 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/286e1852-50a3-4f67-9588-faf932b0d456-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "286e1852-50a3-4f67-9588-faf932b0d456" (UID: "286e1852-50a3-4f67-9588-faf932b0d456"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.932826 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-config-data" (OuterVolumeSpecName: "config-data") pod "b1f0a01c-4d6c-4534-950a-699df43b935a" (UID: "b1f0a01c-4d6c-4534-950a-699df43b935a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.936028 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.943295 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.943322 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/286e1852-50a3-4f67-9588-faf932b0d456-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.943383 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcj8r\" (UniqueName: \"kubernetes.io/projected/b1f0a01c-4d6c-4534-950a-699df43b935a-kube-api-access-wcj8r\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.943405 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74blx\" (UniqueName: \"kubernetes.io/projected/286e1852-50a3-4f67-9588-faf932b0d456-kube-api-access-74blx\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.943420 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b1f0a01c-4d6c-4534-950a-699df43b935a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.943432 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/286e1852-50a3-4f67-9588-faf932b0d456-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.943443 4795 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.949695 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b1f0a01c-4d6c-4534-950a-699df43b935a" (UID: "b1f0a01c-4d6c-4534-950a-699df43b935a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:12 crc kubenswrapper[4795]: I1205 08:45:12.950657 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1f0a01c-4d6c-4534-950a-699df43b935a" (UID: "b1f0a01c-4d6c-4534-950a-699df43b935a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.046378 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.046432 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f0a01c-4d6c-4534-950a-699df43b935a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.446346 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.446330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b1f0a01c-4d6c-4534-950a-699df43b935a","Type":"ContainerDied","Data":"a9ca994b258a47134d9986b79c9412e8194ae0281c14550fa312f7b7a0fac6e6"} Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.446525 4795 scope.go:117] "RemoveContainer" containerID="2f805fdee004ebc6b849916b9f24e479c474d5e914e39cc4e2265bb6e1100ff2" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.493925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" event={"ID":"286e1852-50a3-4f67-9588-faf932b0d456","Type":"ContainerDied","Data":"336eba3287d1edaed15cb45e292e457703ecaa270587fd72572e5ac84b4d68ab"} Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.493976 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="336eba3287d1edaed15cb45e292e457703ecaa270587fd72572e5ac84b4d68ab" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.494039 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.598347 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.635109 4795 scope.go:117] "RemoveContainer" containerID="a15485c5ca1c12de33abea552ef3854d359f6cf021bfbc8c4f438f477a74f382" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.645025 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.709235 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:13 crc kubenswrapper[4795]: E1205 08:45:13.710979 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1347f699-2df1-4e7a-8c79-ddff51fe3d7e" containerName="init" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711004 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1347f699-2df1-4e7a-8c79-ddff51fe3d7e" containerName="init" Dec 05 08:45:13 crc kubenswrapper[4795]: E1205 08:45:13.711038 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerName="sg-core" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711045 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerName="sg-core" Dec 05 08:45:13 crc kubenswrapper[4795]: E1205 08:45:13.711075 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1720beb5-14de-4db2-9581-945e2f781500" containerName="neutron-httpd" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711083 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1720beb5-14de-4db2-9581-945e2f781500" containerName="neutron-httpd" Dec 05 08:45:13 crc kubenswrapper[4795]: E1205 08:45:13.711101 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1720beb5-14de-4db2-9581-945e2f781500" containerName="neutron-api" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711108 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1720beb5-14de-4db2-9581-945e2f781500" containerName="neutron-api" Dec 05 08:45:13 crc kubenswrapper[4795]: E1205 08:45:13.711141 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="286e1852-50a3-4f67-9588-faf932b0d456" containerName="collect-profiles" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711147 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="286e1852-50a3-4f67-9588-faf932b0d456" containerName="collect-profiles" Dec 05 08:45:13 crc kubenswrapper[4795]: E1205 08:45:13.711164 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerName="ceilometer-notification-agent" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711171 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerName="ceilometer-notification-agent" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711510 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="286e1852-50a3-4f67-9588-faf932b0d456" containerName="collect-profiles" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711525 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1720beb5-14de-4db2-9581-945e2f781500" containerName="neutron-api" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711544 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1720beb5-14de-4db2-9581-945e2f781500" containerName="neutron-httpd" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711566 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerName="ceilometer-notification-agent" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711585 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1347f699-2df1-4e7a-8c79-ddff51fe3d7e" containerName="init" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.711624 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" containerName="sg-core" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.748270 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.770825 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.771159 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.853665 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.911867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-scripts\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.911996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.912022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfz4v\" (UniqueName: \"kubernetes.io/projected/2b8c170b-2b9a-43e0-893f-26d761124563-kube-api-access-qfz4v\") pod \"ceilometer-0\" (UID: 
\"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.912068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-config-data\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.912118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-log-httpd\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.912167 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-run-httpd\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:13 crc kubenswrapper[4795]: I1205 08:45:13.912196 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.020242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.020317 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-scripts\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.021498 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.021531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfz4v\" (UniqueName: \"kubernetes.io/projected/2b8c170b-2b9a-43e0-893f-26d761124563-kube-api-access-qfz4v\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.021582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-config-data\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.021649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-log-httpd\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.021695 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-run-httpd\") pod \"ceilometer-0\" (UID: 
\"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.022092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-run-httpd\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.022344 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-log-httpd\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.353063 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-config-data\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.354261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.354885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfz4v\" (UniqueName: \"kubernetes.io/projected/2b8c170b-2b9a-43e0-893f-26d761124563-kube-api-access-qfz4v\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.355367 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.356309 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-scripts\") pod \"ceilometer-0\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.437884 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.511914 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79bbf5b658-5cbs8"] Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.514883 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.523396 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.523522 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.552333 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"d86e89d94962757844e50b0a42fc344e8a17a880839200160f5350ade5d60002"} Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.601371 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79bbf5b658-5cbs8"] Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.610772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" event={"ID":"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2","Type":"ContainerStarted","Data":"8db1beb533092d33bd20b6710443e54a5a52f8bf4df24d6481e3c74877db0617"} Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.611912 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.641215 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-config-data-custom\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.641290 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17050311-556c-4364-bd99-195d690178cb-logs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.641317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-combined-ca-bundle\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.641379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-public-tls-certs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " 
pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.642431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-internal-tls-certs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.642480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-config-data\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.642665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slfpr\" (UniqueName: \"kubernetes.io/projected/17050311-556c-4364-bd99-195d690178cb-kube-api-access-slfpr\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.681797 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" podStartSLOduration=8.68176418 podStartE2EDuration="8.68176418s" podCreationTimestamp="2025-12-05 08:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:45:14.670722904 +0000 UTC m=+1266.243326663" watchObservedRunningTime="2025-12-05 08:45:14.68176418 +0000 UTC m=+1266.254367919" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.753426 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-config-data-custom\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.753484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17050311-556c-4364-bd99-195d690178cb-logs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.753516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-combined-ca-bundle\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.753590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-public-tls-certs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.753801 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-internal-tls-certs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.753820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-config-data\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.753917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slfpr\" (UniqueName: \"kubernetes.io/projected/17050311-556c-4364-bd99-195d690178cb-kube-api-access-slfpr\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.762427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-config-data-custom\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.763289 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17050311-556c-4364-bd99-195d690178cb-logs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.774968 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-internal-tls-certs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.794348 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f0a01c-4d6c-4534-950a-699df43b935a" 
path="/var/lib/kubelet/pods/b1f0a01c-4d6c-4534-950a-699df43b935a/volumes" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.797360 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slfpr\" (UniqueName: \"kubernetes.io/projected/17050311-556c-4364-bd99-195d690178cb-kube-api-access-slfpr\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.806147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-combined-ca-bundle\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.807864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-config-data\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:14 crc kubenswrapper[4795]: I1205 08:45:14.812206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17050311-556c-4364-bd99-195d690178cb-public-tls-certs\") pod \"barbican-api-79bbf5b658-5cbs8\" (UID: \"17050311-556c-4364-bd99-195d690178cb\") " pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.016775 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.228463 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.626246 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d6d5498f5-mkdbf" event={"ID":"c651edcf-b0db-4e86-9c04-6b26df481c95","Type":"ContainerStarted","Data":"8b74042d9a8240656bb80d7b71a928a638369fac58561965afdf191e67de96b1"} Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.627017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d6d5498f5-mkdbf" event={"ID":"c651edcf-b0db-4e86-9c04-6b26df481c95","Type":"ContainerStarted","Data":"7724c271704053668f8227c35012bcdaa3488fcdaae10b1a48723e0bee99bd7d"} Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.634743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"312bcea2-6846-48cc-a766-f047b377b2ec","Type":"ContainerStarted","Data":"c240642067929e1806065537f8e79aca04e4405c867d67994077d44eefb3cb4d"} Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.634877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.634885 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="312bcea2-6846-48cc-a766-f047b377b2ec" containerName="cinder-api" containerID="cri-o://c240642067929e1806065537f8e79aca04e4405c867d67994077d44eefb3cb4d" gracePeriod=30 Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.634864 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="312bcea2-6846-48cc-a766-f047b377b2ec" containerName="cinder-api-log" containerID="cri-o://02b4ccc417d11aae6af05b30e8e9a844cc62008701443872bb23a94423afd993" 
gracePeriod=30 Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.639086 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" event={"ID":"726dc98e-9fe1-4b31-ba77-29d9e165b6d6","Type":"ContainerStarted","Data":"732712a82146880151f10943f39d6324960183a18637052d6ac4272c59af12f6"} Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.639142 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" event={"ID":"726dc98e-9fe1-4b31-ba77-29d9e165b6d6","Type":"ContainerStarted","Data":"754e41994d7a5d2fd1ffcd8170b697897cba13a57c7bbf3158be22bf7df7ce96"} Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.645336 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e185529c-893e-438d-9833-97f2aa0275d1","Type":"ContainerStarted","Data":"8e0ef1a4ec0d0b2d70de8a0ee7b374db8eea20581724cf962c8e5b146a5981b9"} Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.648509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerStarted","Data":"a5e6fffa9b65f63128aec7ccb11f759c928da5ffb0543f03f5cdb0cf640767e8"} Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.652758 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d6d5498f5-mkdbf" podStartSLOduration=6.377316407 podStartE2EDuration="10.652738329s" podCreationTimestamp="2025-12-05 08:45:05 +0000 UTC" firstStartedPulling="2025-12-05 08:45:08.467239224 +0000 UTC m=+1260.039842963" lastFinishedPulling="2025-12-05 08:45:12.742661146 +0000 UTC m=+1264.315264885" observedRunningTime="2025-12-05 08:45:15.649377939 +0000 UTC m=+1267.221981678" watchObservedRunningTime="2025-12-05 08:45:15.652738329 +0000 UTC m=+1267.225342068" Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.732630 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=10.732589372 podStartE2EDuration="10.732589372s" podCreationTimestamp="2025-12-05 08:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:45:15.690714418 +0000 UTC m=+1267.263318157" watchObservedRunningTime="2025-12-05 08:45:15.732589372 +0000 UTC m=+1267.305193111" Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.744603 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79bbf5b658-5cbs8"] Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.775752 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6d6c47b668-nqmgd" podStartSLOduration=6.508691382 podStartE2EDuration="10.775711069s" podCreationTimestamp="2025-12-05 08:45:05 +0000 UTC" firstStartedPulling="2025-12-05 08:45:08.715805393 +0000 UTC m=+1260.288409132" lastFinishedPulling="2025-12-05 08:45:12.98282508 +0000 UTC m=+1264.555428819" observedRunningTime="2025-12-05 08:45:15.760981604 +0000 UTC m=+1267.333585343" watchObservedRunningTime="2025-12-05 08:45:15.775711069 +0000 UTC m=+1267.348314808" Dec 05 08:45:15 crc kubenswrapper[4795]: I1205 08:45:15.811635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=10.553261142 podStartE2EDuration="11.811586851s" podCreationTimestamp="2025-12-05 08:45:04 +0000 UTC" firstStartedPulling="2025-12-05 08:45:08.669893201 +0000 UTC m=+1260.242496940" lastFinishedPulling="2025-12-05 08:45:09.92821891 +0000 UTC m=+1261.500822649" observedRunningTime="2025-12-05 08:45:15.787985568 +0000 UTC m=+1267.360589317" watchObservedRunningTime="2025-12-05 08:45:15.811586851 +0000 UTC m=+1267.384190590" Dec 05 08:45:16 crc kubenswrapper[4795]: I1205 08:45:16.669538 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79bbf5b658-5cbs8" event={"ID":"17050311-556c-4364-bd99-195d690178cb","Type":"ContainerStarted","Data":"d529426dd49296c88329aac400cba32e9222b3f8d260a6f1cc42a1cdd3c64701"} Dec 05 08:45:16 crc kubenswrapper[4795]: I1205 08:45:16.670139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79bbf5b658-5cbs8" event={"ID":"17050311-556c-4364-bd99-195d690178cb","Type":"ContainerStarted","Data":"34eff4d864c1bca8907e1e36a77407b87ec0c66bdf90406992070a004ad77de4"} Dec 05 08:45:16 crc kubenswrapper[4795]: I1205 08:45:16.673003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerStarted","Data":"fc07f32c437b061a4a7d9aa82a05b7b3aa5d05510d1fbbcf062d6de2fdfc45d3"} Dec 05 08:45:16 crc kubenswrapper[4795]: I1205 08:45:16.675583 4795 generic.go:334] "Generic (PLEG): container finished" podID="312bcea2-6846-48cc-a766-f047b377b2ec" containerID="02b4ccc417d11aae6af05b30e8e9a844cc62008701443872bb23a94423afd993" exitCode=143 Dec 05 08:45:16 crc kubenswrapper[4795]: I1205 08:45:16.677447 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"312bcea2-6846-48cc-a766-f047b377b2ec","Type":"ContainerDied","Data":"02b4ccc417d11aae6af05b30e8e9a844cc62008701443872bb23a94423afd993"} Dec 05 08:45:17 crc kubenswrapper[4795]: I1205 08:45:17.703579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79bbf5b658-5cbs8" event={"ID":"17050311-556c-4364-bd99-195d690178cb","Type":"ContainerStarted","Data":"e6e0bbd5e435b85b8c4387e33ca665e953445ea21e5b5e084be5a7eff2f280ca"} Dec 05 08:45:17 crc kubenswrapper[4795]: I1205 08:45:17.704093 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:17 crc kubenswrapper[4795]: I1205 08:45:17.704143 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:17 crc kubenswrapper[4795]: I1205 08:45:17.714055 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerStarted","Data":"9ca3b31511a993507756559d65a894925e723de1356059eca396730f2a4aa1ad"} Dec 05 08:45:17 crc kubenswrapper[4795]: I1205 08:45:17.721508 4795 generic.go:334] "Generic (PLEG): container finished" podID="312bcea2-6846-48cc-a766-f047b377b2ec" containerID="c240642067929e1806065537f8e79aca04e4405c867d67994077d44eefb3cb4d" exitCode=0 Dec 05 08:45:17 crc kubenswrapper[4795]: I1205 08:45:17.721563 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"312bcea2-6846-48cc-a766-f047b377b2ec","Type":"ContainerDied","Data":"c240642067929e1806065537f8e79aca04e4405c867d67994077d44eefb3cb4d"} Dec 05 08:45:17 crc kubenswrapper[4795]: I1205 08:45:17.737789 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79bbf5b658-5cbs8" podStartSLOduration=3.737762378 podStartE2EDuration="3.737762378s" podCreationTimestamp="2025-12-05 08:45:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:45:17.727942394 +0000 UTC m=+1269.300546133" watchObservedRunningTime="2025-12-05 08:45:17.737762378 +0000 UTC m=+1269.310366117" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.128188 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.193253 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-scripts\") pod \"312bcea2-6846-48cc-a766-f047b377b2ec\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.193503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312bcea2-6846-48cc-a766-f047b377b2ec-etc-machine-id\") pod \"312bcea2-6846-48cc-a766-f047b377b2ec\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.193669 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data-custom\") pod \"312bcea2-6846-48cc-a766-f047b377b2ec\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.193705 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxtl7\" (UniqueName: \"kubernetes.io/projected/312bcea2-6846-48cc-a766-f047b377b2ec-kube-api-access-gxtl7\") pod \"312bcea2-6846-48cc-a766-f047b377b2ec\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.193765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-combined-ca-bundle\") pod \"312bcea2-6846-48cc-a766-f047b377b2ec\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.193944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/312bcea2-6846-48cc-a766-f047b377b2ec-logs\") pod \"312bcea2-6846-48cc-a766-f047b377b2ec\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.194058 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data\") pod \"312bcea2-6846-48cc-a766-f047b377b2ec\" (UID: \"312bcea2-6846-48cc-a766-f047b377b2ec\") " Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.197707 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/312bcea2-6846-48cc-a766-f047b377b2ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "312bcea2-6846-48cc-a766-f047b377b2ec" (UID: "312bcea2-6846-48cc-a766-f047b377b2ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.198912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/312bcea2-6846-48cc-a766-f047b377b2ec-logs" (OuterVolumeSpecName: "logs") pod "312bcea2-6846-48cc-a766-f047b377b2ec" (UID: "312bcea2-6846-48cc-a766-f047b377b2ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.207292 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-scripts" (OuterVolumeSpecName: "scripts") pod "312bcea2-6846-48cc-a766-f047b377b2ec" (UID: "312bcea2-6846-48cc-a766-f047b377b2ec"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.208973 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "312bcea2-6846-48cc-a766-f047b377b2ec" (UID: "312bcea2-6846-48cc-a766-f047b377b2ec"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.218833 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312bcea2-6846-48cc-a766-f047b377b2ec-kube-api-access-gxtl7" (OuterVolumeSpecName: "kube-api-access-gxtl7") pod "312bcea2-6846-48cc-a766-f047b377b2ec" (UID: "312bcea2-6846-48cc-a766-f047b377b2ec"). InnerVolumeSpecName "kube-api-access-gxtl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.253815 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "312bcea2-6846-48cc-a766-f047b377b2ec" (UID: "312bcea2-6846-48cc-a766-f047b377b2ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.288935 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data" (OuterVolumeSpecName: "config-data") pod "312bcea2-6846-48cc-a766-f047b377b2ec" (UID: "312bcea2-6846-48cc-a766-f047b377b2ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.297045 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/312bcea2-6846-48cc-a766-f047b377b2ec-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.297090 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.297101 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.297111 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312bcea2-6846-48cc-a766-f047b377b2ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.297124 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.297134 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxtl7\" (UniqueName: \"kubernetes.io/projected/312bcea2-6846-48cc-a766-f047b377b2ec-kube-api-access-gxtl7\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.297145 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312bcea2-6846-48cc-a766-f047b377b2ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.741744 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"312bcea2-6846-48cc-a766-f047b377b2ec","Type":"ContainerDied","Data":"1eec65489d3d55b51464fcb50472f451111a613c717efe229bdebcda355cc276"} Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.743874 4795 scope.go:117] "RemoveContainer" containerID="c240642067929e1806065537f8e79aca04e4405c867d67994077d44eefb3cb4d" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.742725 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.775131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerStarted","Data":"cfaf2e9d6911149cee2c3270391a20d6a36f04fa79db47fb31c8e51d124a6e9a"} Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.810009 4795 scope.go:117] "RemoveContainer" containerID="02b4ccc417d11aae6af05b30e8e9a844cc62008701443872bb23a94423afd993" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.851670 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.868443 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.893545 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:18 crc kubenswrapper[4795]: E1205 08:45:18.894060 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312bcea2-6846-48cc-a766-f047b377b2ec" containerName="cinder-api" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.894081 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="312bcea2-6846-48cc-a766-f047b377b2ec" containerName="cinder-api" Dec 05 08:45:18 crc kubenswrapper[4795]: E1205 08:45:18.894118 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="312bcea2-6846-48cc-a766-f047b377b2ec" containerName="cinder-api-log" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.894126 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="312bcea2-6846-48cc-a766-f047b377b2ec" containerName="cinder-api-log" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.894322 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="312bcea2-6846-48cc-a766-f047b377b2ec" containerName="cinder-api-log" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.894344 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="312bcea2-6846-48cc-a766-f047b377b2ec" containerName="cinder-api" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.895479 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.902183 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.902438 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.902563 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 05 08:45:18 crc kubenswrapper[4795]: I1205 08:45:18.998367 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.034850 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.035012 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqgdp\" (UniqueName: \"kubernetes.io/projected/11d903b1-31af-4f63-ac26-a2bdb125af5b-kube-api-access-wqgdp\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.035074 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-config-data\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.035125 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.035287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-scripts\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.035318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11d903b1-31af-4f63-ac26-a2bdb125af5b-logs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.035445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.035533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11d903b1-31af-4f63-ac26-a2bdb125af5b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.035559 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-config-data-custom\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.137869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-scripts\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.137935 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11d903b1-31af-4f63-ac26-a2bdb125af5b-logs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.137992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc 
kubenswrapper[4795]: I1205 08:45:19.139010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11d903b1-31af-4f63-ac26-a2bdb125af5b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.139044 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-config-data-custom\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.139091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.139158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqgdp\" (UniqueName: \"kubernetes.io/projected/11d903b1-31af-4f63-ac26-a2bdb125af5b-kube-api-access-wqgdp\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.139196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-config-data\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.139221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.140840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/11d903b1-31af-4f63-ac26-a2bdb125af5b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.150693 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11d903b1-31af-4f63-ac26-a2bdb125af5b-logs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.158923 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-scripts\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.165206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.165797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-config-data\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.173576 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wqgdp\" (UniqueName: \"kubernetes.io/projected/11d903b1-31af-4f63-ac26-a2bdb125af5b-kube-api-access-wqgdp\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.175310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.175916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.176005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11d903b1-31af-4f63-ac26-a2bdb125af5b-config-data-custom\") pod \"cinder-api-0\" (UID: \"11d903b1-31af-4f63-ac26-a2bdb125af5b\") " pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.250064 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 05 08:45:19 crc kubenswrapper[4795]: I1205 08:45:19.862254 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.458785 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.791695 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312bcea2-6846-48cc-a766-f047b377b2ec" path="/var/lib/kubelet/pods/312bcea2-6846-48cc-a766-f047b377b2ec/volumes" Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.816884 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.817080 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.817357 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.846555 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerStarted","Data":"a633280fed774af535c9ff49798f6a7ae4d3d244c21725b948835ac3119b802f"} Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.847334 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.855255 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"11d903b1-31af-4f63-ac26-a2bdb125af5b","Type":"ContainerStarted","Data":"95528b73d1452f75f19925b2f9f36a36396de704531ee787e6599f69443822e4"} Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.887705 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.590606642 podStartE2EDuration="7.887679255s" podCreationTimestamp="2025-12-05 08:45:13 +0000 UTC" firstStartedPulling="2025-12-05 08:45:15.275708405 +0000 UTC m=+1266.848312144" lastFinishedPulling="2025-12-05 08:45:19.572781018 +0000 UTC m=+1271.145384757" observedRunningTime="2025-12-05 08:45:20.883285946 +0000 UTC m=+1272.455889685" watchObservedRunningTime="2025-12-05 08:45:20.887679255 +0000 UTC m=+1272.460282994" Dec 05 08:45:20 crc kubenswrapper[4795]: I1205 08:45:20.949277 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:21 crc kubenswrapper[4795]: I1205 08:45:21.241430 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 05 08:45:21 crc kubenswrapper[4795]: I1205 08:45:21.773022 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:21 crc kubenswrapper[4795]: I1205 08:45:21.797000 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:45:21 crc kubenswrapper[4795]: I1205 08:45:21.920775 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-6t2ng"] Dec 05 08:45:21 crc kubenswrapper[4795]: I1205 08:45:21.921130 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" podUID="19ddeb6e-4006-4919-86f7-e82748bd655f" containerName="dnsmasq-dns" containerID="cri-o://5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9" gracePeriod=10 Dec 05 08:45:21 crc kubenswrapper[4795]: I1205 08:45:21.959757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"11d903b1-31af-4f63-ac26-a2bdb125af5b","Type":"ContainerStarted","Data":"10b7df07e59733ebd4ef8cd7473a51160c21241f0f464e068edb586540880583"} Dec 05 08:45:21 crc kubenswrapper[4795]: I1205 08:45:21.960281 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e185529c-893e-438d-9833-97f2aa0275d1" containerName="cinder-scheduler" containerID="cri-o://66133fc296ecc3cfb5cecdb5ae91f4c254ec1bc1d485a6da20174978a722a0c9" gracePeriod=30 Dec 05 08:45:21 crc kubenswrapper[4795]: I1205 08:45:21.961035 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e185529c-893e-438d-9833-97f2aa0275d1" containerName="probe" containerID="cri-o://8e0ef1a4ec0d0b2d70de8a0ee7b374db8eea20581724cf962c8e5b146a5981b9" gracePeriod=30 Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.718801 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.837374 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-nb\") pod \"19ddeb6e-4006-4919-86f7-e82748bd655f\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.837908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-sb\") pod \"19ddeb6e-4006-4919-86f7-e82748bd655f\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.838026 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgwnb\" (UniqueName: \"kubernetes.io/projected/19ddeb6e-4006-4919-86f7-e82748bd655f-kube-api-access-rgwnb\") pod \"19ddeb6e-4006-4919-86f7-e82748bd655f\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.838193 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-config\") pod \"19ddeb6e-4006-4919-86f7-e82748bd655f\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.838341 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-svc\") pod \"19ddeb6e-4006-4919-86f7-e82748bd655f\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.838362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-swift-storage-0\") pod \"19ddeb6e-4006-4919-86f7-e82748bd655f\" (UID: \"19ddeb6e-4006-4919-86f7-e82748bd655f\") " Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.867009 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ddeb6e-4006-4919-86f7-e82748bd655f-kube-api-access-rgwnb" (OuterVolumeSpecName: "kube-api-access-rgwnb") pod "19ddeb6e-4006-4919-86f7-e82748bd655f" (UID: "19ddeb6e-4006-4919-86f7-e82748bd655f"). InnerVolumeSpecName "kube-api-access-rgwnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.940419 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgwnb\" (UniqueName: \"kubernetes.io/projected/19ddeb6e-4006-4919-86f7-e82748bd655f-kube-api-access-rgwnb\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.986580 4795 generic.go:334] "Generic (PLEG): container finished" podID="19ddeb6e-4006-4919-86f7-e82748bd655f" containerID="5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9" exitCode=0 Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.986673 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" event={"ID":"19ddeb6e-4006-4919-86f7-e82748bd655f","Type":"ContainerDied","Data":"5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9"} Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.986716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" event={"ID":"19ddeb6e-4006-4919-86f7-e82748bd655f","Type":"ContainerDied","Data":"ac7a25af40060a927c0191a9d7f6caf40327fe6b0db0e7d151eeba8f1682fd45"} Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.986738 4795 scope.go:117] "RemoveContainer" containerID="5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9" 
Dec 05 08:45:22 crc kubenswrapper[4795]: I1205 08:45:22.986970 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-6t2ng" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.207632 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-config" (OuterVolumeSpecName: "config") pod "19ddeb6e-4006-4919-86f7-e82748bd655f" (UID: "19ddeb6e-4006-4919-86f7-e82748bd655f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.248483 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.252315 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "19ddeb6e-4006-4919-86f7-e82748bd655f" (UID: "19ddeb6e-4006-4919-86f7-e82748bd655f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.285921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19ddeb6e-4006-4919-86f7-e82748bd655f" (UID: "19ddeb6e-4006-4919-86f7-e82748bd655f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.289103 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19ddeb6e-4006-4919-86f7-e82748bd655f" (UID: "19ddeb6e-4006-4919-86f7-e82748bd655f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.298568 4795 scope.go:117] "RemoveContainer" containerID="a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.328211 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19ddeb6e-4006-4919-86f7-e82748bd655f" (UID: "19ddeb6e-4006-4919-86f7-e82748bd655f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.336047 4795 scope.go:117] "RemoveContainer" containerID="5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9" Dec 05 08:45:23 crc kubenswrapper[4795]: E1205 08:45:23.337295 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9\": container with ID starting with 5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9 not found: ID does not exist" containerID="5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.337347 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9"} err="failed to get container status \"5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9\": rpc error: code = NotFound desc = could not find container \"5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9\": container with ID starting with 5799739ee63b26ba5d76dfe94c1e3c58aea437633e8d0d1371d36d8e2dfb62d9 not found: ID does not exist" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.337383 4795 scope.go:117] "RemoveContainer" containerID="a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166" Dec 05 08:45:23 crc kubenswrapper[4795]: E1205 08:45:23.337695 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166\": container with ID starting with a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166 not found: ID does not exist" containerID="a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.337718 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166"} err="failed to get container status \"a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166\": rpc error: code = NotFound desc = could not find container \"a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166\": container with ID starting with a22756b4445516722ad6eca0bf9e2857df7057626afd0cd5371c61c8899d7166 not found: ID does not exist" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.352408 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.352772 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.352880 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.352950 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19ddeb6e-4006-4919-86f7-e82748bd655f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.623174 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-6t2ng"] Dec 05 08:45:23 crc kubenswrapper[4795]: I1205 08:45:23.632838 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-6t2ng"] Dec 05 08:45:24 crc kubenswrapper[4795]: I1205 08:45:24.290084 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"11d903b1-31af-4f63-ac26-a2bdb125af5b","Type":"ContainerStarted","Data":"225fb80d50151ffa67d075ef83ba9b4db31a791fed60099f56d9a10ea1bc8ccd"} Dec 05 08:45:24 crc kubenswrapper[4795]: I1205 08:45:24.290904 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 05 08:45:24 crc kubenswrapper[4795]: I1205 08:45:24.801852 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ddeb6e-4006-4919-86f7-e82748bd655f" path="/var/lib/kubelet/pods/19ddeb6e-4006-4919-86f7-e82748bd655f/volumes" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.309930 4795 generic.go:334] "Generic (PLEG): container finished" podID="e185529c-893e-438d-9833-97f2aa0275d1" containerID="8e0ef1a4ec0d0b2d70de8a0ee7b374db8eea20581724cf962c8e5b146a5981b9" exitCode=0 Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.309969 4795 generic.go:334] "Generic (PLEG): container finished" podID="e185529c-893e-438d-9833-97f2aa0275d1" containerID="66133fc296ecc3cfb5cecdb5ae91f4c254ec1bc1d485a6da20174978a722a0c9" exitCode=0 Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.310886 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e185529c-893e-438d-9833-97f2aa0275d1","Type":"ContainerDied","Data":"8e0ef1a4ec0d0b2d70de8a0ee7b374db8eea20581724cf962c8e5b146a5981b9"} Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.310957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e185529c-893e-438d-9833-97f2aa0275d1","Type":"ContainerDied","Data":"66133fc296ecc3cfb5cecdb5ae91f4c254ec1bc1d485a6da20174978a722a0c9"} Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.469101 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.512483 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.51245545 podStartE2EDuration="7.51245545s" podCreationTimestamp="2025-12-05 08:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:45:24.370532814 +0000 UTC m=+1275.943136563" watchObservedRunningTime="2025-12-05 08:45:25.51245545 +0000 UTC m=+1277.085059199" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.628632 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xs25\" (UniqueName: \"kubernetes.io/projected/e185529c-893e-438d-9833-97f2aa0275d1-kube-api-access-4xs25\") pod \"e185529c-893e-438d-9833-97f2aa0275d1\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.628739 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-scripts\") pod \"e185529c-893e-438d-9833-97f2aa0275d1\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.628835 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data\") pod \"e185529c-893e-438d-9833-97f2aa0275d1\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.628856 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e185529c-893e-438d-9833-97f2aa0275d1-etc-machine-id\") pod \"e185529c-893e-438d-9833-97f2aa0275d1\" (UID: 
\"e185529c-893e-438d-9833-97f2aa0275d1\") " Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.628930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data-custom\") pod \"e185529c-893e-438d-9833-97f2aa0275d1\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.628974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-combined-ca-bundle\") pod \"e185529c-893e-438d-9833-97f2aa0275d1\" (UID: \"e185529c-893e-438d-9833-97f2aa0275d1\") " Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.631909 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e185529c-893e-438d-9833-97f2aa0275d1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e185529c-893e-438d-9833-97f2aa0275d1" (UID: "e185529c-893e-438d-9833-97f2aa0275d1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.646792 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-scripts" (OuterVolumeSpecName: "scripts") pod "e185529c-893e-438d-9833-97f2aa0275d1" (UID: "e185529c-893e-438d-9833-97f2aa0275d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.648834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e185529c-893e-438d-9833-97f2aa0275d1" (UID: "e185529c-893e-438d-9833-97f2aa0275d1"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.672282 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e185529c-893e-438d-9833-97f2aa0275d1-kube-api-access-4xs25" (OuterVolumeSpecName: "kube-api-access-4xs25") pod "e185529c-893e-438d-9833-97f2aa0275d1" (UID: "e185529c-893e-438d-9833-97f2aa0275d1"). InnerVolumeSpecName "kube-api-access-4xs25". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.733171 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xs25\" (UniqueName: \"kubernetes.io/projected/e185529c-893e-438d-9833-97f2aa0275d1-kube-api-access-4xs25\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.733236 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.733247 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e185529c-893e-438d-9833-97f2aa0275d1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.733256 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.734262 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.765052 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e185529c-893e-438d-9833-97f2aa0275d1" (UID: "e185529c-893e-438d-9833-97f2aa0275d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.836251 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.888784 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data" (OuterVolumeSpecName: "config-data") pod "e185529c-893e-438d-9833-97f2aa0275d1" (UID: "e185529c-893e-438d-9833-97f2aa0275d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.901059 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.901164 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.901260 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56588789f4-7xbdx" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.901860 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" 
containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:25 crc kubenswrapper[4795]: I1205 08:45:25.938772 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e185529c-893e-438d-9833-97f2aa0275d1-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.263860 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.382890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e185529c-893e-438d-9833-97f2aa0275d1","Type":"ContainerDied","Data":"1be407f7c37d48aa6608085bfa6427170d4703ae936465056b2ac234f4351ec2"} Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.382984 4795 scope.go:117] "RemoveContainer" containerID="8e0ef1a4ec0d0b2d70de8a0ee7b374db8eea20581724cf962c8e5b146a5981b9" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.383004 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.441239 4795 scope.go:117] "RemoveContainer" containerID="66133fc296ecc3cfb5cecdb5ae91f4c254ec1bc1d485a6da20174978a722a0c9" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.486667 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.541353 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.548509 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:26 crc kubenswrapper[4795]: E1205 08:45:26.549150 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ddeb6e-4006-4919-86f7-e82748bd655f" containerName="init" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.549192 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ddeb6e-4006-4919-86f7-e82748bd655f" containerName="init" Dec 05 08:45:26 crc kubenswrapper[4795]: E1205 08:45:26.549222 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e185529c-893e-438d-9833-97f2aa0275d1" containerName="probe" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.549229 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e185529c-893e-438d-9833-97f2aa0275d1" containerName="probe" Dec 05 08:45:26 crc kubenswrapper[4795]: E1205 08:45:26.549240 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e185529c-893e-438d-9833-97f2aa0275d1" containerName="cinder-scheduler" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.549265 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e185529c-893e-438d-9833-97f2aa0275d1" containerName="cinder-scheduler" Dec 05 08:45:26 crc kubenswrapper[4795]: E1205 08:45:26.549290 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19ddeb6e-4006-4919-86f7-e82748bd655f" containerName="dnsmasq-dns" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.549298 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ddeb6e-4006-4919-86f7-e82748bd655f" containerName="dnsmasq-dns" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.549667 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e185529c-893e-438d-9833-97f2aa0275d1" containerName="cinder-scheduler" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.549728 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e185529c-893e-438d-9833-97f2aa0275d1" containerName="probe" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.549746 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ddeb6e-4006-4919-86f7-e82748bd655f" containerName="dnsmasq-dns" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.554558 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.559418 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.585725 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.585802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn47f\" (UniqueName: \"kubernetes.io/projected/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-kube-api-access-zn47f\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" 
Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.585849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-scripts\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.585910 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-config-data\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.585928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.585987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.619096 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.688012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn47f\" (UniqueName: \"kubernetes.io/projected/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-kube-api-access-zn47f\") pod \"cinder-scheduler-0\" (UID: 
\"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.688080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-scripts\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.688159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-config-data\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.688184 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.688258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.688298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.689324 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.697959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.698166 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-scripts\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.709309 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-config-data\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.710152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.727559 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn47f\" (UniqueName: \"kubernetes.io/projected/f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce-kube-api-access-zn47f\") pod 
\"cinder-scheduler-0\" (UID: \"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce\") " pod="openstack/cinder-scheduler-0" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.762447 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e185529c-893e-438d-9833-97f2aa0275d1" path="/var/lib/kubelet/pods/e185529c-893e-438d-9833-97f2aa0275d1/volumes" Dec 05 08:45:26 crc kubenswrapper[4795]: I1205 08:45:26.909276 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 05 08:45:27 crc kubenswrapper[4795]: I1205 08:45:27.535372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 05 08:45:27 crc kubenswrapper[4795]: I1205 08:45:27.977430 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-85d5d69654-vzspj" Dec 05 08:45:28 crc kubenswrapper[4795]: I1205 08:45:28.431517 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce","Type":"ContainerStarted","Data":"2049d1cfca48a92f94f682aebc658970c82458748a43b3a1d0e57e73b1f89e8c"} Dec 05 08:45:29 crc kubenswrapper[4795]: I1205 08:45:29.221364 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-79bbf5b658-5cbs8" podUID="17050311-556c-4364-bd99-195d690178cb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:29 crc kubenswrapper[4795]: I1205 08:45:29.452304 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:29 crc kubenswrapper[4795]: I1205 08:45:29.462834 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce","Type":"ContainerStarted","Data":"17c143a9888454d0d44179529b9a201c8a3b7358437544d7236ac90f415859cd"} Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.023897 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79bbf5b658-5cbs8" podUID="17050311-556c-4364-bd99-195d690178cb" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.165:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.475153 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce","Type":"ContainerStarted","Data":"209b0dd901dcaf79ad3f500afe1c22647d4a3fe6eae05b7dfbe0e0f325d83a85"} Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.686543 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.686518062 podStartE2EDuration="4.686518062s" podCreationTimestamp="2025-12-05 08:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:45:30.509883453 +0000 UTC m=+1282.082487182" watchObservedRunningTime="2025-12-05 08:45:30.686518062 +0000 UTC m=+1282.259121801" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.704372 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.705954 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.709128 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.709408 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-vh9jt" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.713447 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.722173 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.856028 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c38f36-24c0-4c36-986c-8a7552eadfbb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.856120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/61c38f36-24c0-4c36-986c-8a7552eadfbb-openstack-config-secret\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.856338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/61c38f36-24c0-4c36-986c-8a7552eadfbb-openstack-config\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.856491 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lk4f\" (UniqueName: \"kubernetes.io/projected/61c38f36-24c0-4c36-986c-8a7552eadfbb-kube-api-access-8lk4f\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.961214 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c38f36-24c0-4c36-986c-8a7552eadfbb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.961285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/61c38f36-24c0-4c36-986c-8a7552eadfbb-openstack-config-secret\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.961349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/61c38f36-24c0-4c36-986c-8a7552eadfbb-openstack-config\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.961382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lk4f\" (UniqueName: \"kubernetes.io/projected/61c38f36-24c0-4c36-986c-8a7552eadfbb-kube-api-access-8lk4f\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.962897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/61c38f36-24c0-4c36-986c-8a7552eadfbb-openstack-config\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.968811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61c38f36-24c0-4c36-986c-8a7552eadfbb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.969197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/61c38f36-24c0-4c36-986c-8a7552eadfbb-openstack-config-secret\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:30 crc kubenswrapper[4795]: I1205 08:45:30.987597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lk4f\" (UniqueName: \"kubernetes.io/projected/61c38f36-24c0-4c36-986c-8a7552eadfbb-kube-api-access-8lk4f\") pod \"openstackclient\" (UID: \"61c38f36-24c0-4c36-986c-8a7552eadfbb\") " pod="openstack/openstackclient" Dec 05 08:45:31 crc kubenswrapper[4795]: I1205 08:45:31.024711 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 05 08:45:31 crc kubenswrapper[4795]: I1205 08:45:31.857670 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 05 08:45:31 crc kubenswrapper[4795]: W1205 08:45:31.909113 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61c38f36_24c0_4c36_986c_8a7552eadfbb.slice/crio-2d933743977da5bc0b73cdcfd010910f67fb1231778903929dbc8056b899fb35 WatchSource:0}: Error finding container 2d933743977da5bc0b73cdcfd010910f67fb1231778903929dbc8056b899fb35: Status 404 returned error can't find the container with id 2d933743977da5bc0b73cdcfd010910f67fb1231778903929dbc8056b899fb35 Dec 05 08:45:31 crc kubenswrapper[4795]: I1205 08:45:31.909402 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 05 08:45:31 crc kubenswrapper[4795]: I1205 08:45:31.925070 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 08:45:32 crc kubenswrapper[4795]: I1205 08:45:32.502914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"61c38f36-24c0-4c36-986c-8a7552eadfbb","Type":"ContainerStarted","Data":"2d933743977da5bc0b73cdcfd010910f67fb1231778903929dbc8056b899fb35"} Dec 05 08:45:33 crc kubenswrapper[4795]: I1205 08:45:33.283899 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="11d903b1-31af-4f63-ac26-a2bdb125af5b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:34 crc kubenswrapper[4795]: I1205 08:45:34.257759 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="11d903b1-31af-4f63-ac26-a2bdb125af5b" containerName="cinder-api" 
probeResult="failure" output="Get \"https://10.217.0.166:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:45:34 crc kubenswrapper[4795]: I1205 08:45:34.390249 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79bbf5b658-5cbs8" Dec 05 08:45:34 crc kubenswrapper[4795]: I1205 08:45:34.476129 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6977767f64-7wgr9"] Dec 05 08:45:34 crc kubenswrapper[4795]: I1205 08:45:34.476509 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api-log" containerID="cri-o://404edce1ce538e5539ed675e02b355684baa7de19abf1b878fcf9b1b91429017" gracePeriod=30 Dec 05 08:45:34 crc kubenswrapper[4795]: I1205 08:45:34.477101 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api" containerID="cri-o://6e060856ae3aa62bca8f3d8877b57437e11419a8776013e24b00b7480d724c70" gracePeriod=30 Dec 05 08:45:35 crc kubenswrapper[4795]: I1205 08:45:35.571181 4795 generic.go:334] "Generic (PLEG): container finished" podID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerID="404edce1ce538e5539ed675e02b355684baa7de19abf1b878fcf9b1b91429017" exitCode=143 Dec 05 08:45:35 crc kubenswrapper[4795]: I1205 08:45:35.571376 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977767f64-7wgr9" event={"ID":"0f621e5b-0030-4a7d-9985-b65eafd8f1f7","Type":"ContainerDied","Data":"404edce1ce538e5539ed675e02b355684baa7de19abf1b878fcf9b1b91429017"} Dec 05 08:45:37 crc kubenswrapper[4795]: I1205 08:45:37.283004 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 05 08:45:37 crc 
kubenswrapper[4795]: I1205 08:45:37.857521 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 05 08:45:38 crc kubenswrapper[4795]: I1205 08:45:38.090627 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:35484->10.217.0.162:9311: read: connection reset by peer" Dec 05 08:45:38 crc kubenswrapper[4795]: I1205 08:45:38.091342 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6977767f64-7wgr9" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:35480->10.217.0.162:9311: read: connection reset by peer" Dec 05 08:45:38 crc kubenswrapper[4795]: I1205 08:45:38.636869 4795 generic.go:334] "Generic (PLEG): container finished" podID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerID="6e060856ae3aa62bca8f3d8877b57437e11419a8776013e24b00b7480d724c70" exitCode=0 Dec 05 08:45:38 crc kubenswrapper[4795]: I1205 08:45:38.636950 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977767f64-7wgr9" event={"ID":"0f621e5b-0030-4a7d-9985-b65eafd8f1f7","Type":"ContainerDied","Data":"6e060856ae3aa62bca8f3d8877b57437e11419a8776013e24b00b7480d724c70"} Dec 05 08:45:38 crc kubenswrapper[4795]: I1205 08:45:38.916227 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.093014 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data\") pod \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.093109 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwprm\" (UniqueName: \"kubernetes.io/projected/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-kube-api-access-fwprm\") pod \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.093149 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-logs\") pod \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.093188 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-combined-ca-bundle\") pod \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.093241 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data-custom\") pod \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\" (UID: \"0f621e5b-0030-4a7d-9985-b65eafd8f1f7\") " Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.094537 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-logs" (OuterVolumeSpecName: "logs") pod "0f621e5b-0030-4a7d-9985-b65eafd8f1f7" (UID: "0f621e5b-0030-4a7d-9985-b65eafd8f1f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.102764 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-kube-api-access-fwprm" (OuterVolumeSpecName: "kube-api-access-fwprm") pod "0f621e5b-0030-4a7d-9985-b65eafd8f1f7" (UID: "0f621e5b-0030-4a7d-9985-b65eafd8f1f7"). InnerVolumeSpecName "kube-api-access-fwprm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.104752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f621e5b-0030-4a7d-9985-b65eafd8f1f7" (UID: "0f621e5b-0030-4a7d-9985-b65eafd8f1f7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.180839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data" (OuterVolumeSpecName: "config-data") pod "0f621e5b-0030-4a7d-9985-b65eafd8f1f7" (UID: "0f621e5b-0030-4a7d-9985-b65eafd8f1f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.196159 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.196226 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwprm\" (UniqueName: \"kubernetes.io/projected/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-kube-api-access-fwprm\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.196241 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.196255 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.203059 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f621e5b-0030-4a7d-9985-b65eafd8f1f7" (UID: "0f621e5b-0030-4a7d-9985-b65eafd8f1f7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.299546 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f621e5b-0030-4a7d-9985-b65eafd8f1f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.653047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6977767f64-7wgr9" event={"ID":"0f621e5b-0030-4a7d-9985-b65eafd8f1f7","Type":"ContainerDied","Data":"0b9530a45cd984ad5a229f65ddf79e8b412ce3002e730800f4091a2fee40eae6"} Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.653148 4795 scope.go:117] "RemoveContainer" containerID="6e060856ae3aa62bca8f3d8877b57437e11419a8776013e24b00b7480d724c70" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.653417 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6977767f64-7wgr9" Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.714405 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6977767f64-7wgr9"] Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.731874 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6977767f64-7wgr9"] Dec 05 08:45:39 crc kubenswrapper[4795]: I1205 08:45:39.736289 4795 scope.go:117] "RemoveContainer" containerID="404edce1ce538e5539ed675e02b355684baa7de19abf1b878fcf9b1b91429017" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.669594 4795 generic.go:334] "Generic (PLEG): container finished" podID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerID="ecee4fc18281693579f2445417cd59b08213910e3f12f77dc348f4cadec4c8ce" exitCode=137 Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.669689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" 
event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerDied","Data":"ecee4fc18281693579f2445417cd59b08213910e3f12f77dc348f4cadec4c8ce"} Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.675726 4795 generic.go:334] "Generic (PLEG): container finished" podID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerID="be19f62cf7c60fe65931433e0a5734a5bcb27c66fffa49ad918909cee2adf63a" exitCode=137 Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.675815 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerDied","Data":"be19f62cf7c60fe65931433e0a5734a5bcb27c66fffa49ad918909cee2adf63a"} Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.768663 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" path="/var/lib/kubelet/pods/0f621e5b-0030-4a7d-9985-b65eafd8f1f7/volumes" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.830914 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-584d9d78f9-hwfvk"] Dec 05 08:45:40 crc kubenswrapper[4795]: E1205 08:45:40.831387 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.831406 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api" Dec 05 08:45:40 crc kubenswrapper[4795]: E1205 08:45:40.831436 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api-log" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.831443 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api-log" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.831658 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.831684 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f621e5b-0030-4a7d-9985-b65eafd8f1f7" containerName="barbican-api-log" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.832740 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.855590 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.855979 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.856177 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.864712 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-584d9d78f9-hwfvk"] Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.931542 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-etc-swift\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.931636 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-public-tls-certs\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 
08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.931677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fw5r\" (UniqueName: \"kubernetes.io/projected/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-kube-api-access-8fw5r\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.931708 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-run-httpd\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.931763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-log-httpd\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.931846 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-combined-ca-bundle\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.931875 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-config-data\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " 
pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:40 crc kubenswrapper[4795]: I1205 08:45:40.931968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-internal-tls-certs\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.033675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-internal-tls-certs\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.033744 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-etc-swift\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.033793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-public-tls-certs\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.033829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fw5r\" (UniqueName: \"kubernetes.io/projected/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-kube-api-access-8fw5r\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " 
pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.033872 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-run-httpd\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.033930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-log-httpd\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.034018 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-combined-ca-bundle\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.034051 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-config-data\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.037180 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-log-httpd\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 
08:45:41.037403 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-run-httpd\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.044161 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-combined-ca-bundle\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.044379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-etc-swift\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.045864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-internal-tls-certs\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.047822 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-public-tls-certs\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.059038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8fw5r\" (UniqueName: \"kubernetes.io/projected/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-kube-api-access-8fw5r\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.062871 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78ae9e33-4a1a-4296-8b17-65c7775bd5ec-config-data\") pod \"swift-proxy-584d9d78f9-hwfvk\" (UID: \"78ae9e33-4a1a-4296-8b17-65c7775bd5ec\") " pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.157997 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.521655 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.522038 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="ceilometer-central-agent" containerID="cri-o://fc07f32c437b061a4a7d9aa82a05b7b3aa5d05510d1fbbcf062d6de2fdfc45d3" gracePeriod=30 Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.522229 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="proxy-httpd" containerID="cri-o://a633280fed774af535c9ff49798f6a7ae4d3d244c21725b948835ac3119b802f" gracePeriod=30 Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.522272 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="sg-core" containerID="cri-o://cfaf2e9d6911149cee2c3270391a20d6a36f04fa79db47fb31c8e51d124a6e9a" 
gracePeriod=30 Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.522309 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="ceilometer-notification-agent" containerID="cri-o://9ca3b31511a993507756559d65a894925e723de1356059eca396730f2a4aa1ad" gracePeriod=30 Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.553740 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": EOF" Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.695226 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b8c170b-2b9a-43e0-893f-26d761124563" containerID="cfaf2e9d6911149cee2c3270391a20d6a36f04fa79db47fb31c8e51d124a6e9a" exitCode=2 Dec 05 08:45:41 crc kubenswrapper[4795]: I1205 08:45:41.695307 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerDied","Data":"cfaf2e9d6911149cee2c3270391a20d6a36f04fa79db47fb31c8e51d124a6e9a"} Dec 05 08:45:42 crc kubenswrapper[4795]: I1205 08:45:42.721681 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b8c170b-2b9a-43e0-893f-26d761124563" containerID="a633280fed774af535c9ff49798f6a7ae4d3d244c21725b948835ac3119b802f" exitCode=0 Dec 05 08:45:42 crc kubenswrapper[4795]: I1205 08:45:42.722002 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b8c170b-2b9a-43e0-893f-26d761124563" containerID="fc07f32c437b061a4a7d9aa82a05b7b3aa5d05510d1fbbcf062d6de2fdfc45d3" exitCode=0 Dec 05 08:45:42 crc kubenswrapper[4795]: I1205 08:45:42.722029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerDied","Data":"a633280fed774af535c9ff49798f6a7ae4d3d244c21725b948835ac3119b802f"} Dec 05 08:45:42 crc kubenswrapper[4795]: I1205 08:45:42.722061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerDied","Data":"fc07f32c437b061a4a7d9aa82a05b7b3aa5d05510d1fbbcf062d6de2fdfc45d3"} Dec 05 08:45:44 crc kubenswrapper[4795]: I1205 08:45:44.441381 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": dial tcp 10.217.0.164:3000: connect: connection refused" Dec 05 08:45:46 crc kubenswrapper[4795]: I1205 08:45:46.782062 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b8c170b-2b9a-43e0-893f-26d761124563" containerID="9ca3b31511a993507756559d65a894925e723de1356059eca396730f2a4aa1ad" exitCode=0 Dec 05 08:45:46 crc kubenswrapper[4795]: I1205 08:45:46.782237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerDied","Data":"9ca3b31511a993507756559d65a894925e723de1356059eca396730f2a4aa1ad"} Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.415850 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.424403 4795 scope.go:117] "RemoveContainer" containerID="7bba1bafb156148fcd8b7c8758fb509be8bd7485951c6f325bcba1d144733dcd" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.536179 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-sg-core-conf-yaml\") pod \"2b8c170b-2b9a-43e0-893f-26d761124563\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.536278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-combined-ca-bundle\") pod \"2b8c170b-2b9a-43e0-893f-26d761124563\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.536402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-config-data\") pod \"2b8c170b-2b9a-43e0-893f-26d761124563\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.536467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfz4v\" (UniqueName: \"kubernetes.io/projected/2b8c170b-2b9a-43e0-893f-26d761124563-kube-api-access-qfz4v\") pod \"2b8c170b-2b9a-43e0-893f-26d761124563\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.536566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-scripts\") pod \"2b8c170b-2b9a-43e0-893f-26d761124563\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " 
Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.536602 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-log-httpd\") pod \"2b8c170b-2b9a-43e0-893f-26d761124563\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.536667 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-run-httpd\") pod \"2b8c170b-2b9a-43e0-893f-26d761124563\" (UID: \"2b8c170b-2b9a-43e0-893f-26d761124563\") " Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.538734 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2b8c170b-2b9a-43e0-893f-26d761124563" (UID: "2b8c170b-2b9a-43e0-893f-26d761124563"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.538821 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2b8c170b-2b9a-43e0-893f-26d761124563" (UID: "2b8c170b-2b9a-43e0-893f-26d761124563"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.543218 4795 scope.go:117] "RemoveContainer" containerID="0632fdc0318a531f12e72de0b9794a8c04674111d045ebce05ad344a56b33f12" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.551512 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-scripts" (OuterVolumeSpecName: "scripts") pod "2b8c170b-2b9a-43e0-893f-26d761124563" (UID: "2b8c170b-2b9a-43e0-893f-26d761124563"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.564343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8c170b-2b9a-43e0-893f-26d761124563-kube-api-access-qfz4v" (OuterVolumeSpecName: "kube-api-access-qfz4v") pod "2b8c170b-2b9a-43e0-893f-26d761124563" (UID: "2b8c170b-2b9a-43e0-893f-26d761124563"). InnerVolumeSpecName "kube-api-access-qfz4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.597869 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2b8c170b-2b9a-43e0-893f-26d761124563" (UID: "2b8c170b-2b9a-43e0-893f-26d761124563"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.606588 4795 scope.go:117] "RemoveContainer" containerID="59136305fd1058486df9521531f4959e70ae74a2ca46438e74d9130abad786c2" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.639363 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.639389 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfz4v\" (UniqueName: \"kubernetes.io/projected/2b8c170b-2b9a-43e0-893f-26d761124563-kube-api-access-qfz4v\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.639400 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.639409 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.639418 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b8c170b-2b9a-43e0-893f-26d761124563-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.701518 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b8c170b-2b9a-43e0-893f-26d761124563" (UID: "2b8c170b-2b9a-43e0-893f-26d761124563"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.744532 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.750988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-config-data" (OuterVolumeSpecName: "config-data") pod "2b8c170b-2b9a-43e0-893f-26d761124563" (UID: "2b8c170b-2b9a-43e0-893f-26d761124563"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.816547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-584d9d78f9-hwfvk"] Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.822196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"61c38f36-24c0-4c36-986c-8a7552eadfbb","Type":"ContainerStarted","Data":"b139205d1d3f61f5e80dfd03546c5f641c41813e6d76e3e2a730c049b11f9049"} Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.827898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b8c170b-2b9a-43e0-893f-26d761124563","Type":"ContainerDied","Data":"a5e6fffa9b65f63128aec7ccb11f759c928da5ffb0543f03f5cdb0cf640767e8"} Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.827949 4795 scope.go:117] "RemoveContainer" containerID="a633280fed774af535c9ff49798f6a7ae4d3d244c21725b948835ac3119b802f" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.828276 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.838563 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerStarted","Data":"c588144102533680a34a5726505baa9bf35c19577cb0770b52dbb674df3a4575"} Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.853224 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.64929641 podStartE2EDuration="18.853202227s" podCreationTimestamp="2025-12-05 08:45:30 +0000 UTC" firstStartedPulling="2025-12-05 08:45:31.924776532 +0000 UTC m=+1283.497380271" lastFinishedPulling="2025-12-05 08:45:48.128682349 +0000 UTC m=+1299.701286088" observedRunningTime="2025-12-05 08:45:48.8514624 +0000 UTC m=+1300.424066139" watchObservedRunningTime="2025-12-05 08:45:48.853202227 +0000 UTC m=+1300.425805966" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.854529 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerStarted","Data":"6048f959ec29f10a78dde1c3f2cbc14da8e357cd096b40fc95865c55fc1eeb57"} Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.868196 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b8c170b-2b9a-43e0-893f-26d761124563-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.962241 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:48 crc kubenswrapper[4795]: I1205 08:45:48.990267 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.023734 4795 scope.go:117] "RemoveContainer" 
containerID="cfaf2e9d6911149cee2c3270391a20d6a36f04fa79db47fb31c8e51d124a6e9a" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.063042 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:49 crc kubenswrapper[4795]: E1205 08:45:49.067253 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="ceilometer-notification-agent" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.067555 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="ceilometer-notification-agent" Dec 05 08:45:49 crc kubenswrapper[4795]: E1205 08:45:49.067751 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="ceilometer-central-agent" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.067830 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="ceilometer-central-agent" Dec 05 08:45:49 crc kubenswrapper[4795]: E1205 08:45:49.067941 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="proxy-httpd" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.068016 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="proxy-httpd" Dec 05 08:45:49 crc kubenswrapper[4795]: E1205 08:45:49.068117 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="sg-core" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.068173 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="sg-core" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.071080 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="ceilometer-notification-agent" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.071230 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="proxy-httpd" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.071342 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="ceilometer-central-agent" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.071435 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" containerName="sg-core" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.098832 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.119718 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.120066 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.144004 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.195500 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-config-data\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.195557 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.195673 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-run-httpd\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.195694 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-log-httpd\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.195719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhmzf\" (UniqueName: \"kubernetes.io/projected/fa02a411-fef9-4bb7-974a-17b4168d97c2-kube-api-access-vhmzf\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.195748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-scripts\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.195768 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: 
I1205 08:45:49.196469 4795 scope.go:117] "RemoveContainer" containerID="9ca3b31511a993507756559d65a894925e723de1356059eca396730f2a4aa1ad" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.232176 4795 scope.go:117] "RemoveContainer" containerID="fc07f32c437b061a4a7d9aa82a05b7b3aa5d05510d1fbbcf062d6de2fdfc45d3" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.300781 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-run-httpd\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.300847 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-log-httpd\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.300889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhmzf\" (UniqueName: \"kubernetes.io/projected/fa02a411-fef9-4bb7-974a-17b4168d97c2-kube-api-access-vhmzf\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.300939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-scripts\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.300960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.301007 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-config-data\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.301046 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.302166 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-run-httpd\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.302447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-log-httpd\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.308001 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-scripts\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.313107 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.313558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-config-data\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.315279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.332251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhmzf\" (UniqueName: \"kubernetes.io/projected/fa02a411-fef9-4bb7-974a-17b4168d97c2-kube-api-access-vhmzf\") pod \"ceilometer-0\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.505431 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.640432 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.899730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-584d9d78f9-hwfvk" event={"ID":"78ae9e33-4a1a-4296-8b17-65c7775bd5ec","Type":"ContainerStarted","Data":"17947be6ee829421c119d270bef4aaa8293147c16410f25de95a598292e462a8"} Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.900157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-584d9d78f9-hwfvk" event={"ID":"78ae9e33-4a1a-4296-8b17-65c7775bd5ec","Type":"ContainerStarted","Data":"d263b03a01d53a8080cbc0d81463b352d98e4774c0d669fa06fa5a5938583892"} Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.900172 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-584d9d78f9-hwfvk" event={"ID":"78ae9e33-4a1a-4296-8b17-65c7775bd5ec","Type":"ContainerStarted","Data":"8a96ced00b6b62ca775cfa68adbaa05071e51982223f56c71c7c6b08bd585441"} Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.900219 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.900890 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:49 crc kubenswrapper[4795]: I1205 08:45:49.982034 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-584d9d78f9-hwfvk" podStartSLOduration=9.982000642 podStartE2EDuration="9.982000642s" podCreationTimestamp="2025-12-05 08:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:45:49.934355278 +0000 UTC m=+1301.506959017" 
watchObservedRunningTime="2025-12-05 08:45:49.982000642 +0000 UTC m=+1301.554604381" Dec 05 08:45:50 crc kubenswrapper[4795]: I1205 08:45:50.032572 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:45:50 crc kubenswrapper[4795]: I1205 08:45:50.032704 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:45:50 crc kubenswrapper[4795]: I1205 08:45:50.203524 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:45:50 crc kubenswrapper[4795]: I1205 08:45:50.359190 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:45:50 crc kubenswrapper[4795]: I1205 08:45:50.359257 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:45:50 crc kubenswrapper[4795]: I1205 08:45:50.767202 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8c170b-2b9a-43e0-893f-26d761124563" path="/var/lib/kubelet/pods/2b8c170b-2b9a-43e0-893f-26d761124563/volumes" Dec 05 08:45:50 crc kubenswrapper[4795]: I1205 08:45:50.910837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerStarted","Data":"ea648398e804a0894569efa31eedad545d5c793535e97bf809bdbc7498dca453"} Dec 05 08:45:51 crc kubenswrapper[4795]: I1205 08:45:51.923667 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerStarted","Data":"e03c3622e6c2701c3d767ab0038d2a9e7a5fc606632c12c0551f91d5147bc0ef"} Dec 05 08:45:51 crc kubenswrapper[4795]: I1205 08:45:51.924119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerStarted","Data":"f342314a811195e8347957d7bd25884704a2490fcf0ab179dbae28b256eb8d28"} Dec 05 08:45:52 crc kubenswrapper[4795]: I1205 08:45:52.946895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerStarted","Data":"c21f0d92ddc55d2246b7a80bdf5aaa3b60e0c5054806fc25421f033730834a3c"} Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.182314 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.272104 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-584d9d78f9-hwfvk" Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.582080 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.582533 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-httpd" containerID="cri-o://81a78d568bbd90cc1466533a14180c78a16082a4e4e8ba33b8591288f81dfc42" gracePeriod=30 Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.582886 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-log" containerID="cri-o://e7c2f5e493d5e22571e238563f990d6c647a80e9da152a29cf8facde3f8373f5" gracePeriod=30 Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.991697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerStarted","Data":"40fbd00fd40d345abc4a40b7a91ac0b2679acb18c81e418ee4ab5e07cf7be9d6"} 
Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.992146 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="ceilometer-central-agent" containerID="cri-o://f342314a811195e8347957d7bd25884704a2490fcf0ab179dbae28b256eb8d28" gracePeriod=30 Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.992202 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.992330 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="proxy-httpd" containerID="cri-o://40fbd00fd40d345abc4a40b7a91ac0b2679acb18c81e418ee4ab5e07cf7be9d6" gracePeriod=30 Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.992398 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="sg-core" containerID="cri-o://c21f0d92ddc55d2246b7a80bdf5aaa3b60e0c5054806fc25421f033730834a3c" gracePeriod=30 Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.992455 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="ceilometer-notification-agent" containerID="cri-o://e03c3622e6c2701c3d767ab0038d2a9e7a5fc606632c12c0551f91d5147bc0ef" gracePeriod=30 Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.999338 4795 generic.go:334] "Generic (PLEG): container finished" podID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerID="e7c2f5e493d5e22571e238563f990d6c647a80e9da152a29cf8facde3f8373f5" exitCode=143 Dec 05 08:45:56 crc kubenswrapper[4795]: I1205 08:45:56.999754 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e94043a9-32bf-4842-ad1f-583d7bb8b933","Type":"ContainerDied","Data":"e7c2f5e493d5e22571e238563f990d6c647a80e9da152a29cf8facde3f8373f5"} Dec 05 08:45:57 crc kubenswrapper[4795]: I1205 08:45:57.037106 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.091528737 podStartE2EDuration="9.037071488s" podCreationTimestamp="2025-12-05 08:45:48 +0000 UTC" firstStartedPulling="2025-12-05 08:45:50.22309188 +0000 UTC m=+1301.795695609" lastFinishedPulling="2025-12-05 08:45:56.168634621 +0000 UTC m=+1307.741238360" observedRunningTime="2025-12-05 08:45:57.022985758 +0000 UTC m=+1308.595589497" watchObservedRunningTime="2025-12-05 08:45:57.037071488 +0000 UTC m=+1308.609675227" Dec 05 08:45:58 crc kubenswrapper[4795]: I1205 08:45:58.014128 4795 generic.go:334] "Generic (PLEG): container finished" podID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerID="c21f0d92ddc55d2246b7a80bdf5aaa3b60e0c5054806fc25421f033730834a3c" exitCode=2 Dec 05 08:45:58 crc kubenswrapper[4795]: I1205 08:45:58.014539 4795 generic.go:334] "Generic (PLEG): container finished" podID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerID="e03c3622e6c2701c3d767ab0038d2a9e7a5fc606632c12c0551f91d5147bc0ef" exitCode=0 Dec 05 08:45:58 crc kubenswrapper[4795]: I1205 08:45:58.014201 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerDied","Data":"c21f0d92ddc55d2246b7a80bdf5aaa3b60e0c5054806fc25421f033730834a3c"} Dec 05 08:45:58 crc kubenswrapper[4795]: I1205 08:45:58.014587 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerDied","Data":"e03c3622e6c2701c3d767ab0038d2a9e7a5fc606632c12c0551f91d5147bc0ef"} Dec 05 08:45:58 crc kubenswrapper[4795]: I1205 08:45:58.562868 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 05 08:45:58 crc kubenswrapper[4795]: I1205 08:45:58.564319 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-log" containerID="cri-o://d2c608376194ede01479d346c99e7e45d06925ba63944024e044cd8d0dc920d7" gracePeriod=30 Dec 05 08:45:58 crc kubenswrapper[4795]: I1205 08:45:58.564400 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-httpd" containerID="cri-o://4b6bfe84272ec2a162ad7e434e9aaf9bb544e0f2c86fd535921bf973033ee045" gracePeriod=30 Dec 05 08:45:59 crc kubenswrapper[4795]: I1205 08:45:59.026895 4795 generic.go:334] "Generic (PLEG): container finished" podID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerID="d2c608376194ede01479d346c99e7e45d06925ba63944024e044cd8d0dc920d7" exitCode=143 Dec 05 08:45:59 crc kubenswrapper[4795]: I1205 08:45:59.026955 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010","Type":"ContainerDied","Data":"d2c608376194ede01479d346c99e7e45d06925ba63944024e044cd8d0dc920d7"} Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.035344 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.360239 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.514516 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-x7v2r"] Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.517065 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.535038 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x7v2r"] Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.603062 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc46818-e0ca-4331-9f50-f01e5f50d812-operator-scripts\") pod \"nova-api-db-create-x7v2r\" (UID: \"6dc46818-e0ca-4331-9f50-f01e5f50d812\") " pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.603507 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52wk8\" (UniqueName: \"kubernetes.io/projected/6dc46818-e0ca-4331-9f50-f01e5f50d812-kube-api-access-52wk8\") pod \"nova-api-db-create-x7v2r\" (UID: \"6dc46818-e0ca-4331-9f50-f01e5f50d812\") " pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.615070 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gfv9h"] Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.616914 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.648108 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gfv9h"] Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.658163 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": dial tcp 10.217.0.149:9292: connect: connection refused" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.658188 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": dial tcp 10.217.0.149:9292: connect: connection refused" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.705648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzd2w\" (UniqueName: \"kubernetes.io/projected/33dfb11f-a33f-463e-ae7a-8f3891042c4d-kube-api-access-mzd2w\") pod \"nova-cell0-db-create-gfv9h\" (UID: \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\") " pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.705763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33dfb11f-a33f-463e-ae7a-8f3891042c4d-operator-scripts\") pod \"nova-cell0-db-create-gfv9h\" (UID: \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\") " pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.705854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6dc46818-e0ca-4331-9f50-f01e5f50d812-operator-scripts\") pod \"nova-api-db-create-x7v2r\" (UID: \"6dc46818-e0ca-4331-9f50-f01e5f50d812\") " pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.705876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52wk8\" (UniqueName: \"kubernetes.io/projected/6dc46818-e0ca-4331-9f50-f01e5f50d812-kube-api-access-52wk8\") pod \"nova-api-db-create-x7v2r\" (UID: \"6dc46818-e0ca-4331-9f50-f01e5f50d812\") " pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.707201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc46818-e0ca-4331-9f50-f01e5f50d812-operator-scripts\") pod \"nova-api-db-create-x7v2r\" (UID: \"6dc46818-e0ca-4331-9f50-f01e5f50d812\") " pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.748072 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52wk8\" (UniqueName: \"kubernetes.io/projected/6dc46818-e0ca-4331-9f50-f01e5f50d812-kube-api-access-52wk8\") pod \"nova-api-db-create-x7v2r\" (UID: \"6dc46818-e0ca-4331-9f50-f01e5f50d812\") " pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.809360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33dfb11f-a33f-463e-ae7a-8f3891042c4d-operator-scripts\") pod \"nova-cell0-db-create-gfv9h\" (UID: \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\") " pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.809646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzd2w\" (UniqueName: 
\"kubernetes.io/projected/33dfb11f-a33f-463e-ae7a-8f3891042c4d-kube-api-access-mzd2w\") pod \"nova-cell0-db-create-gfv9h\" (UID: \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\") " pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.810305 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33dfb11f-a33f-463e-ae7a-8f3891042c4d-operator-scripts\") pod \"nova-cell0-db-create-gfv9h\" (UID: \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\") " pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.822954 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-2nlv6"] Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.824641 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.838094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzd2w\" (UniqueName: \"kubernetes.io/projected/33dfb11f-a33f-463e-ae7a-8f3891042c4d-kube-api-access-mzd2w\") pod \"nova-cell0-db-create-gfv9h\" (UID: \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\") " pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.843711 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.847734 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ed0a-account-create-update-j8w29"] Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.849301 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.862129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.900213 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2nlv6"] Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.912665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tq4s\" (UniqueName: \"kubernetes.io/projected/415c9625-8f52-41fc-818f-420b59863110-kube-api-access-7tq4s\") pod \"nova-cell1-db-create-2nlv6\" (UID: \"415c9625-8f52-41fc-818f-420b59863110\") " pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.915306 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/415c9625-8f52-41fc-818f-420b59863110-operator-scripts\") pod \"nova-cell1-db-create-2nlv6\" (UID: \"415c9625-8f52-41fc-818f-420b59863110\") " pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.931568 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ed0a-account-create-update-j8w29"] Dec 05 08:46:00 crc kubenswrapper[4795]: I1205 08:46:00.939035 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.020901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28nfv\" (UniqueName: \"kubernetes.io/projected/689df0bf-6323-4159-8433-8d916f33abff-kube-api-access-28nfv\") pod \"nova-api-ed0a-account-create-update-j8w29\" (UID: \"689df0bf-6323-4159-8433-8d916f33abff\") " pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.021029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/689df0bf-6323-4159-8433-8d916f33abff-operator-scripts\") pod \"nova-api-ed0a-account-create-update-j8w29\" (UID: \"689df0bf-6323-4159-8433-8d916f33abff\") " pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.021084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tq4s\" (UniqueName: \"kubernetes.io/projected/415c9625-8f52-41fc-818f-420b59863110-kube-api-access-7tq4s\") pod \"nova-cell1-db-create-2nlv6\" (UID: \"415c9625-8f52-41fc-818f-420b59863110\") " pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.021193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/415c9625-8f52-41fc-818f-420b59863110-operator-scripts\") pod \"nova-cell1-db-create-2nlv6\" (UID: \"415c9625-8f52-41fc-818f-420b59863110\") " pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.027408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/415c9625-8f52-41fc-818f-420b59863110-operator-scripts\") pod 
\"nova-cell1-db-create-2nlv6\" (UID: \"415c9625-8f52-41fc-818f-420b59863110\") " pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.076499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tq4s\" (UniqueName: \"kubernetes.io/projected/415c9625-8f52-41fc-818f-420b59863110-kube-api-access-7tq4s\") pod \"nova-cell1-db-create-2nlv6\" (UID: \"415c9625-8f52-41fc-818f-420b59863110\") " pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.099114 4795 generic.go:334] "Generic (PLEG): container finished" podID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerID="81a78d568bbd90cc1466533a14180c78a16082a4e4e8ba33b8591288f81dfc42" exitCode=0 Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.099464 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e94043a9-32bf-4842-ad1f-583d7bb8b933","Type":"ContainerDied","Data":"81a78d568bbd90cc1466533a14180c78a16082a4e4e8ba33b8591288f81dfc42"} Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.123554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28nfv\" (UniqueName: \"kubernetes.io/projected/689df0bf-6323-4159-8433-8d916f33abff-kube-api-access-28nfv\") pod \"nova-api-ed0a-account-create-update-j8w29\" (UID: \"689df0bf-6323-4159-8433-8d916f33abff\") " pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.123784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/689df0bf-6323-4159-8433-8d916f33abff-operator-scripts\") pod \"nova-api-ed0a-account-create-update-j8w29\" (UID: \"689df0bf-6323-4159-8433-8d916f33abff\") " pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.125037 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/689df0bf-6323-4159-8433-8d916f33abff-operator-scripts\") pod \"nova-api-ed0a-account-create-update-j8w29\" (UID: \"689df0bf-6323-4159-8433-8d916f33abff\") " pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.187433 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28nfv\" (UniqueName: \"kubernetes.io/projected/689df0bf-6323-4159-8433-8d916f33abff-kube-api-access-28nfv\") pod \"nova-api-ed0a-account-create-update-j8w29\" (UID: \"689df0bf-6323-4159-8433-8d916f33abff\") " pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.198342 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4e84-account-create-update-tqw8b"] Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.200687 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.221558 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4e84-account-create-update-tqw8b"] Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.230064 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.253356 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.285180 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ed0a-account-create-update-j8w29"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.300322 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3218-account-create-update-8lz4k"]
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.316870 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3218-account-create-update-8lz4k"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.327204 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.347371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj8dv\" (UniqueName: \"kubernetes.io/projected/a704000f-7677-42e5-86cd-4cbc8134785b-kube-api-access-sj8dv\") pod \"nova-cell0-4e84-account-create-update-tqw8b\" (UID: \"a704000f-7677-42e5-86cd-4cbc8134785b\") " pod="openstack/nova-cell0-4e84-account-create-update-tqw8b"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.347934 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a704000f-7677-42e5-86cd-4cbc8134785b-operator-scripts\") pod \"nova-cell0-4e84-account-create-update-tqw8b\" (UID: \"a704000f-7677-42e5-86cd-4cbc8134785b\") " pod="openstack/nova-cell0-4e84-account-create-update-tqw8b"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.350542 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3218-account-create-update-8lz4k"]
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.450861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj8dv\" (UniqueName: \"kubernetes.io/projected/a704000f-7677-42e5-86cd-4cbc8134785b-kube-api-access-sj8dv\") pod \"nova-cell0-4e84-account-create-update-tqw8b\" (UID: \"a704000f-7677-42e5-86cd-4cbc8134785b\") " pod="openstack/nova-cell0-4e84-account-create-update-tqw8b"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.450922 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a704000f-7677-42e5-86cd-4cbc8134785b-operator-scripts\") pod \"nova-cell0-4e84-account-create-update-tqw8b\" (UID: \"a704000f-7677-42e5-86cd-4cbc8134785b\") " pod="openstack/nova-cell0-4e84-account-create-update-tqw8b"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.450961 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-488xj\" (UniqueName: \"kubernetes.io/projected/c97250f7-6ac0-48ea-897a-741b1bc97c1d-kube-api-access-488xj\") pod \"nova-cell1-3218-account-create-update-8lz4k\" (UID: \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\") " pod="openstack/nova-cell1-3218-account-create-update-8lz4k"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.450980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c97250f7-6ac0-48ea-897a-741b1bc97c1d-operator-scripts\") pod \"nova-cell1-3218-account-create-update-8lz4k\" (UID: \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\") " pod="openstack/nova-cell1-3218-account-create-update-8lz4k"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.452462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a704000f-7677-42e5-86cd-4cbc8134785b-operator-scripts\") pod \"nova-cell0-4e84-account-create-update-tqw8b\" (UID: \"a704000f-7677-42e5-86cd-4cbc8134785b\") " pod="openstack/nova-cell0-4e84-account-create-update-tqw8b"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.527678 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj8dv\" (UniqueName: \"kubernetes.io/projected/a704000f-7677-42e5-86cd-4cbc8134785b-kube-api-access-sj8dv\") pod \"nova-cell0-4e84-account-create-update-tqw8b\" (UID: \"a704000f-7677-42e5-86cd-4cbc8134785b\") " pod="openstack/nova-cell0-4e84-account-create-update-tqw8b"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.556389 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c97250f7-6ac0-48ea-897a-741b1bc97c1d-operator-scripts\") pod \"nova-cell1-3218-account-create-update-8lz4k\" (UID: \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\") " pod="openstack/nova-cell1-3218-account-create-update-8lz4k"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.556463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-488xj\" (UniqueName: \"kubernetes.io/projected/c97250f7-6ac0-48ea-897a-741b1bc97c1d-kube-api-access-488xj\") pod \"nova-cell1-3218-account-create-update-8lz4k\" (UID: \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\") " pod="openstack/nova-cell1-3218-account-create-update-8lz4k"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.562661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c97250f7-6ac0-48ea-897a-741b1bc97c1d-operator-scripts\") pod \"nova-cell1-3218-account-create-update-8lz4k\" (UID: \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\") " pod="openstack/nova-cell1-3218-account-create-update-8lz4k"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.589770 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.643854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-488xj\" (UniqueName: \"kubernetes.io/projected/c97250f7-6ac0-48ea-897a-741b1bc97c1d-kube-api-access-488xj\") pod \"nova-cell1-3218-account-create-update-8lz4k\" (UID: \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\") " pod="openstack/nova-cell1-3218-account-create-update-8lz4k"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.652508 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.755188 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3218-account-create-update-8lz4k"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.765574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e94043a9-32bf-4842-ad1f-583d7bb8b933\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.765653 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-logs\") pod \"e94043a9-32bf-4842-ad1f-583d7bb8b933\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.765712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-public-tls-certs\") pod \"e94043a9-32bf-4842-ad1f-583d7bb8b933\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.765732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-config-data\") pod \"e94043a9-32bf-4842-ad1f-583d7bb8b933\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.765762 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-scripts\") pod \"e94043a9-32bf-4842-ad1f-583d7bb8b933\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.765788 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-combined-ca-bundle\") pod \"e94043a9-32bf-4842-ad1f-583d7bb8b933\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.765851 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-httpd-run\") pod \"e94043a9-32bf-4842-ad1f-583d7bb8b933\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.765872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tfc4\" (UniqueName: \"kubernetes.io/projected/e94043a9-32bf-4842-ad1f-583d7bb8b933-kube-api-access-6tfc4\") pod \"e94043a9-32bf-4842-ad1f-583d7bb8b933\" (UID: \"e94043a9-32bf-4842-ad1f-583d7bb8b933\") "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.770093 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-logs" (OuterVolumeSpecName: "logs") pod "e94043a9-32bf-4842-ad1f-583d7bb8b933" (UID: "e94043a9-32bf-4842-ad1f-583d7bb8b933"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.780855 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e94043a9-32bf-4842-ad1f-583d7bb8b933" (UID: "e94043a9-32bf-4842-ad1f-583d7bb8b933"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.790915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-scripts" (OuterVolumeSpecName: "scripts") pod "e94043a9-32bf-4842-ad1f-583d7bb8b933" (UID: "e94043a9-32bf-4842-ad1f-583d7bb8b933"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.791044 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e94043a9-32bf-4842-ad1f-583d7bb8b933" (UID: "e94043a9-32bf-4842-ad1f-583d7bb8b933"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.806125 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94043a9-32bf-4842-ad1f-583d7bb8b933-kube-api-access-6tfc4" (OuterVolumeSpecName: "kube-api-access-6tfc4") pod "e94043a9-32bf-4842-ad1f-583d7bb8b933" (UID: "e94043a9-32bf-4842-ad1f-583d7bb8b933"). InnerVolumeSpecName "kube-api-access-6tfc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.871583 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.871640 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-logs\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.871651 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.871660 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e94043a9-32bf-4842-ad1f-583d7bb8b933-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.871669 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tfc4\" (UniqueName: \"kubernetes.io/projected/e94043a9-32bf-4842-ad1f-583d7bb8b933-kube-api-access-6tfc4\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.943827 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x7v2r"]
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.965884 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.975601 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:01 crc kubenswrapper[4795]: I1205 08:46:01.979557 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e94043a9-32bf-4842-ad1f-583d7bb8b933" (UID: "e94043a9-32bf-4842-ad1f-583d7bb8b933"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.002181 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gfv9h"]
Dec 05 08:46:02 crc kubenswrapper[4795]: W1205 08:46:02.014553 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dc46818_e0ca_4331_9f50_f01e5f50d812.slice/crio-8edede84df7040ae4ff954edd63194939912971794369012b01e1bc73ca3172f WatchSource:0}: Error finding container 8edede84df7040ae4ff954edd63194939912971794369012b01e1bc73ca3172f: Status 404 returned error can't find the container with id 8edede84df7040ae4ff954edd63194939912971794369012b01e1bc73ca3172f
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.019120 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-config-data" (OuterVolumeSpecName: "config-data") pod "e94043a9-32bf-4842-ad1f-583d7bb8b933" (UID: "e94043a9-32bf-4842-ad1f-583d7bb8b933"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.055078 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": read tcp 10.217.0.2:47382->10.217.0.148:9292: read: connection reset by peer"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.055517 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": read tcp 10.217.0.2:47398->10.217.0.148:9292: read: connection reset by peer"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.087598 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.087651 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.103262 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e94043a9-32bf-4842-ad1f-583d7bb8b933" (UID: "e94043a9-32bf-4842-ad1f-583d7bb8b933"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.158288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gfv9h" event={"ID":"33dfb11f-a33f-463e-ae7a-8f3891042c4d","Type":"ContainerStarted","Data":"0a3a6cd4c51f97e0f88f872b4283458891cf4d0ba805777ab6cd9c43d02e8cbf"}
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.219635 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94043a9-32bf-4842-ad1f-583d7bb8b933-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.226288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e94043a9-32bf-4842-ad1f-583d7bb8b933","Type":"ContainerDied","Data":"571da090e601e3d2e50598dad2aa0efd39a2de925f2b4138736154be418c654a"}
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.226351 4795 scope.go:117] "RemoveContainer" containerID="81a78d568bbd90cc1466533a14180c78a16082a4e4e8ba33b8591288f81dfc42"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.226599 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.304977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x7v2r" event={"ID":"6dc46818-e0ca-4331-9f50-f01e5f50d812","Type":"ContainerStarted","Data":"8edede84df7040ae4ff954edd63194939912971794369012b01e1bc73ca3172f"}
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.386731 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2nlv6"]
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.434584 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.441036 4795 scope.go:117] "RemoveContainer" containerID="e7c2f5e493d5e22571e238563f990d6c647a80e9da152a29cf8facde3f8373f5"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.515660 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.515786 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 08:46:02 crc kubenswrapper[4795]: E1205 08:46:02.516407 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-httpd"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.516425 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-httpd"
Dec 05 08:46:02 crc kubenswrapper[4795]: E1205 08:46:02.516453 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-log"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.516459 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-log"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.516901 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-log"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.517003 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" containerName="glance-httpd"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.518373 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.542719 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.543010 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.574822 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.628809 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f3d4d78-a614-4724-ba42-bc6f0a44be83-logs\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.628910 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.628940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.628997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f3d4d78-a614-4724-ba42-bc6f0a44be83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.629019 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.629065 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.629105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.629150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xrdd\" (UniqueName: \"kubernetes.io/projected/8f3d4d78-a614-4724-ba42-bc6f0a44be83-kube-api-access-6xrdd\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.658833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4e84-account-create-update-tqw8b"]
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.690234 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ed0a-account-create-update-j8w29"]
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.742770 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.742840 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.742901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f3d4d78-a614-4724-ba42-bc6f0a44be83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.742917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.742957 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.743000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.743053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xrdd\" (UniqueName: \"kubernetes.io/projected/8f3d4d78-a614-4724-ba42-bc6f0a44be83-kube-api-access-6xrdd\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.743169 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f3d4d78-a614-4724-ba42-bc6f0a44be83-logs\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.743904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f3d4d78-a614-4724-ba42-bc6f0a44be83-logs\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.755278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.756681 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.757132 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.758814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f3d4d78-a614-4724-ba42-bc6f0a44be83-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.764748 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.780216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f3d4d78-a614-4724-ba42-bc6f0a44be83-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.805953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xrdd\" (UniqueName: \"kubernetes.io/projected/8f3d4d78-a614-4724-ba42-bc6f0a44be83-kube-api-access-6xrdd\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.826083 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94043a9-32bf-4842-ad1f-583d7bb8b933" path="/var/lib/kubelet/pods/e94043a9-32bf-4842-ad1f-583d7bb8b933/volumes"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.911536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"8f3d4d78-a614-4724-ba42-bc6f0a44be83\") " pod="openstack/glance-default-external-api-0"
Dec 05 08:46:02 crc kubenswrapper[4795]: I1205 08:46:02.927133 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3218-account-create-update-8lz4k"]
Dec 05 08:46:03 crc kubenswrapper[4795]: W1205 08:46:03.011868 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc97250f7_6ac0_48ea_897a_741b1bc97c1d.slice/crio-2c3fc54c2e6240611967784f2c95086a7a5b1595fc9cb774aca19045a2fef071 WatchSource:0}: Error finding container 2c3fc54c2e6240611967784f2c95086a7a5b1595fc9cb774aca19045a2fef071: Status 404 returned error can't find the container with id 2c3fc54c2e6240611967784f2c95086a7a5b1595fc9cb774aca19045a2fef071
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.140975 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.421207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ed0a-account-create-update-j8w29" event={"ID":"689df0bf-6323-4159-8433-8d916f33abff","Type":"ContainerStarted","Data":"a36e20ad03cf42e3592125df384d8dfa950f609164ae084951c7aca45fa82b56"}
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.469453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3218-account-create-update-8lz4k" event={"ID":"c97250f7-6ac0-48ea-897a-741b1bc97c1d","Type":"ContainerStarted","Data":"2c3fc54c2e6240611967784f2c95086a7a5b1595fc9cb774aca19045a2fef071"}
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.485854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b" event={"ID":"a704000f-7677-42e5-86cd-4cbc8134785b","Type":"ContainerStarted","Data":"c248d4120e0941355a0bf18db7c71f4817cd9e811009d795aa9a1871674b5a59"}
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.550526 4795 generic.go:334] "Generic (PLEG): container finished" podID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerID="4b6bfe84272ec2a162ad7e434e9aaf9bb544e0f2c86fd535921bf973033ee045" exitCode=0
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.550582 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010","Type":"ContainerDied","Data":"4b6bfe84272ec2a162ad7e434e9aaf9bb544e0f2c86fd535921bf973033ee045"}
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.555120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2nlv6" event={"ID":"415c9625-8f52-41fc-818f-420b59863110","Type":"ContainerStarted","Data":"eeac53a2c7227503258c56bb5a266cc34d7823e6cc28c51cd35eecb3a45fae13"}
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.600894 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-2nlv6" podStartSLOduration=3.600866783 podStartE2EDuration="3.600866783s" podCreationTimestamp="2025-12-05 08:46:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:46:03.587917555 +0000 UTC m=+1315.160521294" watchObservedRunningTime="2025-12-05 08:46:03.600866783 +0000 UTC m=+1315.173470522"
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.813091 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.940457 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") "
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.940524 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v4ng\" (UniqueName: \"kubernetes.io/projected/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-kube-api-access-2v4ng\") pod \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") "
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.940550 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-combined-ca-bundle\") pod \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") "
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.940627 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-config-data\") pod \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") "
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.940708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-logs\") pod \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") "
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.940759 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-internal-tls-certs\") pod \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") "
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.940808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-scripts\") pod \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") "
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.940858 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-httpd-run\") pod \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\" (UID: \"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010\") "
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.943861 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" (UID: "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.944179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-logs" (OuterVolumeSpecName: "logs") pod "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" (UID: "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.964317 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-scripts" (OuterVolumeSpecName: "scripts") pod "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" (UID: "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.972108 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-kube-api-access-2v4ng" (OuterVolumeSpecName: "kube-api-access-2v4ng") pod "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" (UID: "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010"). InnerVolumeSpecName "kube-api-access-2v4ng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:46:03 crc kubenswrapper[4795]: I1205 08:46:03.992360 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" (UID: "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.008504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" (UID: "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.045180 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v4ng\" (UniqueName: \"kubernetes.io/projected/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-kube-api-access-2v4ng\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.045230 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.045240 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-logs\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.045250 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.045259 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.045292 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\"
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.225394 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.256113 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.326750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-config-data" (OuterVolumeSpecName: "config-data") pod "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" (UID: "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.340671 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" (UID: "606a3ccf-b193-4ec6-b7a5-65a6d8e2c010"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.358286 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.358322 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.552393 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 05 08:46:04 crc kubenswrapper[4795]: W1205 08:46:04.573452 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f3d4d78_a614_4724_ba42_bc6f0a44be83.slice/crio-e7225eb11c6f9a0e30dffe9118cd1546d4cc7e8af396bcdda2e61db22d513920 WatchSource:0}: Error finding container e7225eb11c6f9a0e30dffe9118cd1546d4cc7e8af396bcdda2e61db22d513920: Status 404 returned error can't find the container with id e7225eb11c6f9a0e30dffe9118cd1546d4cc7e8af396bcdda2e61db22d513920 Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.575900 4795 generic.go:334] "Generic (PLEG): container finished" podID="33dfb11f-a33f-463e-ae7a-8f3891042c4d" containerID="025a3d8731d45f13171e50e3bdda0242427ad278829f9a0635d5732d3fcb88d7" exitCode=0 Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.575973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gfv9h" event={"ID":"33dfb11f-a33f-463e-ae7a-8f3891042c4d","Type":"ContainerDied","Data":"025a3d8731d45f13171e50e3bdda0242427ad278829f9a0635d5732d3fcb88d7"} Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.585970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"606a3ccf-b193-4ec6-b7a5-65a6d8e2c010","Type":"ContainerDied","Data":"bfeb06ca5f52c1d93ed4a19e827c425fd32bb5657d8469e4f506835009eb5f0b"} Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.586373 4795 scope.go:117] "RemoveContainer" containerID="4b6bfe84272ec2a162ad7e434e9aaf9bb544e0f2c86fd535921bf973033ee045" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.587712 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.608469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2nlv6" event={"ID":"415c9625-8f52-41fc-818f-420b59863110","Type":"ContainerStarted","Data":"0afe21a5272589853b1a8d25b0d25fb526f7b510a918cef6cfd1dd053c5bef09"} Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.635847 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3218-account-create-update-8lz4k" event={"ID":"c97250f7-6ac0-48ea-897a-741b1bc97c1d","Type":"ContainerStarted","Data":"34e91df4f29c5b1464a26a71df42630a156b939dccb1c4734ee5530c32fb8c7d"} Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.650972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b" event={"ID":"a704000f-7677-42e5-86cd-4cbc8134785b","Type":"ContainerStarted","Data":"92bd802f405429443aba150f3e2c90044b94f1db8e464eb4ec38757edcbbdef8"} Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.659540 4795 generic.go:334] "Generic (PLEG): container finished" podID="6dc46818-e0ca-4331-9f50-f01e5f50d812" containerID="1a8638debddde0b27d190e26595fa1579fb342e98510f1dd133bc83ea82b0b0b" exitCode=0 Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.659604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x7v2r" 
event={"ID":"6dc46818-e0ca-4331-9f50-f01e5f50d812","Type":"ContainerDied","Data":"1a8638debddde0b27d190e26595fa1579fb342e98510f1dd133bc83ea82b0b0b"} Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.666521 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-3218-account-create-update-8lz4k" podStartSLOduration=3.666501296 podStartE2EDuration="3.666501296s" podCreationTimestamp="2025-12-05 08:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:46:04.658277214 +0000 UTC m=+1316.230880973" watchObservedRunningTime="2025-12-05 08:46:04.666501296 +0000 UTC m=+1316.239105045" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.719442 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b" podStartSLOduration=3.719411241 podStartE2EDuration="3.719411241s" podCreationTimestamp="2025-12-05 08:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:46:04.699980378 +0000 UTC m=+1316.272584117" watchObservedRunningTime="2025-12-05 08:46:04.719411241 +0000 UTC m=+1316.292014980" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.925556 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.948992 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.954187 4795 scope.go:117] "RemoveContainer" containerID="d2c608376194ede01479d346c99e7e45d06925ba63944024e044cd8d0dc920d7" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.970028 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 
08:46:04 crc kubenswrapper[4795]: E1205 08:46:04.970566 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-httpd" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.970585 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-httpd" Dec 05 08:46:04 crc kubenswrapper[4795]: E1205 08:46:04.970599 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-log" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.970622 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-log" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.970836 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-httpd" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.970860 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" containerName="glance-log" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.972004 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.979178 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.979454 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 05 08:46:04 crc kubenswrapper[4795]: I1205 08:46:04.981666 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.083332 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.083486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/acfa2531-9c7e-4017-b32d-3a4b07038cca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.083536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.083559 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/acfa2531-9c7e-4017-b32d-3a4b07038cca-logs\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.083622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.083647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.083688 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.083756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4kg\" (UniqueName: \"kubernetes.io/projected/acfa2531-9c7e-4017-b32d-3a4b07038cca-kube-api-access-ll4kg\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.185387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/acfa2531-9c7e-4017-b32d-3a4b07038cca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.185455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.185478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acfa2531-9c7e-4017-b32d-3a4b07038cca-logs\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.185507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.185534 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.185559 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.185633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4kg\" (UniqueName: \"kubernetes.io/projected/acfa2531-9c7e-4017-b32d-3a4b07038cca-kube-api-access-ll4kg\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.185905 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.187799 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.188074 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acfa2531-9c7e-4017-b32d-3a4b07038cca-logs\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.188449 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/acfa2531-9c7e-4017-b32d-3a4b07038cca-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.196750 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.198157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.202568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-scripts\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.206907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfa2531-9c7e-4017-b32d-3a4b07038cca-config-data\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.209668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4kg\" (UniqueName: \"kubernetes.io/projected/acfa2531-9c7e-4017-b32d-3a4b07038cca-kube-api-access-ll4kg\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " 
pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.286496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-0\" (UID: \"acfa2531-9c7e-4017-b32d-3a4b07038cca\") " pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.317747 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.691292 4795 generic.go:334] "Generic (PLEG): container finished" podID="415c9625-8f52-41fc-818f-420b59863110" containerID="0afe21a5272589853b1a8d25b0d25fb526f7b510a918cef6cfd1dd053c5bef09" exitCode=0 Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.691836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2nlv6" event={"ID":"415c9625-8f52-41fc-818f-420b59863110","Type":"ContainerDied","Data":"0afe21a5272589853b1a8d25b0d25fb526f7b510a918cef6cfd1dd053c5bef09"} Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.708106 4795 generic.go:334] "Generic (PLEG): container finished" podID="689df0bf-6323-4159-8433-8d916f33abff" containerID="b764a5be5f069078ea70c0e1d1e84b6c85c7d4d02fe019d1a8b8c4a27fc10d6f" exitCode=0 Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.708194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ed0a-account-create-update-j8w29" event={"ID":"689df0bf-6323-4159-8433-8d916f33abff","Type":"ContainerDied","Data":"b764a5be5f069078ea70c0e1d1e84b6c85c7d4d02fe019d1a8b8c4a27fc10d6f"} Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.715005 4795 generic.go:334] "Generic (PLEG): container finished" podID="c97250f7-6ac0-48ea-897a-741b1bc97c1d" containerID="34e91df4f29c5b1464a26a71df42630a156b939dccb1c4734ee5530c32fb8c7d" exitCode=0 Dec 05 
08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.715098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3218-account-create-update-8lz4k" event={"ID":"c97250f7-6ac0-48ea-897a-741b1bc97c1d","Type":"ContainerDied","Data":"34e91df4f29c5b1464a26a71df42630a156b939dccb1c4734ee5530c32fb8c7d"} Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.725007 4795 generic.go:334] "Generic (PLEG): container finished" podID="a704000f-7677-42e5-86cd-4cbc8134785b" containerID="92bd802f405429443aba150f3e2c90044b94f1db8e464eb4ec38757edcbbdef8" exitCode=0 Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.725455 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b" event={"ID":"a704000f-7677-42e5-86cd-4cbc8134785b","Type":"ContainerDied","Data":"92bd802f405429443aba150f3e2c90044b94f1db8e464eb4ec38757edcbbdef8"} Dec 05 08:46:05 crc kubenswrapper[4795]: I1205 08:46:05.744045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f3d4d78-a614-4724-ba42-bc6f0a44be83","Type":"ContainerStarted","Data":"e7225eb11c6f9a0e30dffe9118cd1546d4cc7e8af396bcdda2e61db22d513920"} Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.320247 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.498503 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.574652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzd2w\" (UniqueName: \"kubernetes.io/projected/33dfb11f-a33f-463e-ae7a-8f3891042c4d-kube-api-access-mzd2w\") pod \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\" (UID: \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\") " Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.574844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33dfb11f-a33f-463e-ae7a-8f3891042c4d-operator-scripts\") pod \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\" (UID: \"33dfb11f-a33f-463e-ae7a-8f3891042c4d\") " Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.591519 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33dfb11f-a33f-463e-ae7a-8f3891042c4d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33dfb11f-a33f-463e-ae7a-8f3891042c4d" (UID: "33dfb11f-a33f-463e-ae7a-8f3891042c4d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.605907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33dfb11f-a33f-463e-ae7a-8f3891042c4d-kube-api-access-mzd2w" (OuterVolumeSpecName: "kube-api-access-mzd2w") pod "33dfb11f-a33f-463e-ae7a-8f3891042c4d" (UID: "33dfb11f-a33f-463e-ae7a-8f3891042c4d"). InnerVolumeSpecName "kube-api-access-mzd2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.686234 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzd2w\" (UniqueName: \"kubernetes.io/projected/33dfb11f-a33f-463e-ae7a-8f3891042c4d-kube-api-access-mzd2w\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.686299 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33dfb11f-a33f-463e-ae7a-8f3891042c4d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.688928 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.792879 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc46818-e0ca-4331-9f50-f01e5f50d812-operator-scripts\") pod \"6dc46818-e0ca-4331-9f50-f01e5f50d812\" (UID: \"6dc46818-e0ca-4331-9f50-f01e5f50d812\") " Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.801493 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dc46818-e0ca-4331-9f50-f01e5f50d812-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6dc46818-e0ca-4331-9f50-f01e5f50d812" (UID: "6dc46818-e0ca-4331-9f50-f01e5f50d812"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.802668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52wk8\" (UniqueName: \"kubernetes.io/projected/6dc46818-e0ca-4331-9f50-f01e5f50d812-kube-api-access-52wk8\") pod \"6dc46818-e0ca-4331-9f50-f01e5f50d812\" (UID: \"6dc46818-e0ca-4331-9f50-f01e5f50d812\") " Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.803393 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dc46818-e0ca-4331-9f50-f01e5f50d812-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.820482 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc46818-e0ca-4331-9f50-f01e5f50d812-kube-api-access-52wk8" (OuterVolumeSpecName: "kube-api-access-52wk8") pod "6dc46818-e0ca-4331-9f50-f01e5f50d812" (UID: "6dc46818-e0ca-4331-9f50-f01e5f50d812"). InnerVolumeSpecName "kube-api-access-52wk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.837399 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="606a3ccf-b193-4ec6-b7a5-65a6d8e2c010" path="/var/lib/kubelet/pods/606a3ccf-b193-4ec6-b7a5-65a6d8e2c010/volumes" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.924128 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52wk8\" (UniqueName: \"kubernetes.io/projected/6dc46818-e0ca-4331-9f50-f01e5f50d812-kube-api-access-52wk8\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.996990 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-x7v2r" Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.997794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x7v2r" event={"ID":"6dc46818-e0ca-4331-9f50-f01e5f50d812","Type":"ContainerDied","Data":"8edede84df7040ae4ff954edd63194939912971794369012b01e1bc73ca3172f"} Dec 05 08:46:06 crc kubenswrapper[4795]: I1205 08:46:06.997858 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8edede84df7040ae4ff954edd63194939912971794369012b01e1bc73ca3172f" Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.009820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f3d4d78-a614-4724-ba42-bc6f0a44be83","Type":"ContainerStarted","Data":"145c5b127648d4288280532f34e42a15a47a054027ea7fcbbfe62f51b3d3a38f"} Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.017763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gfv9h" event={"ID":"33dfb11f-a33f-463e-ae7a-8f3891042c4d","Type":"ContainerDied","Data":"0a3a6cd4c51f97e0f88f872b4283458891cf4d0ba805777ab6cd9c43d02e8cbf"} Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.017815 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a3a6cd4c51f97e0f88f872b4283458891cf4d0ba805777ab6cd9c43d02e8cbf" Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.017893 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-gfv9h" Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.036208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"acfa2531-9c7e-4017-b32d-3a4b07038cca","Type":"ContainerStarted","Data":"010f9b9dd3bb6c67b3c40f04fcc5dcb9b62c58fae833333945c1d5ee35e24582"} Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.797524 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3218-account-create-update-8lz4k" Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.966050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c97250f7-6ac0-48ea-897a-741b1bc97c1d-operator-scripts\") pod \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\" (UID: \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\") " Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.966209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-488xj\" (UniqueName: \"kubernetes.io/projected/c97250f7-6ac0-48ea-897a-741b1bc97c1d-kube-api-access-488xj\") pod \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\" (UID: \"c97250f7-6ac0-48ea-897a-741b1bc97c1d\") " Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.967443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c97250f7-6ac0-48ea-897a-741b1bc97c1d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c97250f7-6ac0-48ea-897a-741b1bc97c1d" (UID: "c97250f7-6ac0-48ea-897a-741b1bc97c1d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:46:07 crc kubenswrapper[4795]: I1205 08:46:07.981074 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97250f7-6ac0-48ea-897a-741b1bc97c1d-kube-api-access-488xj" (OuterVolumeSpecName: "kube-api-access-488xj") pod "c97250f7-6ac0-48ea-897a-741b1bc97c1d" (UID: "c97250f7-6ac0-48ea-897a-741b1bc97c1d"). InnerVolumeSpecName "kube-api-access-488xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.070733 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-488xj\" (UniqueName: \"kubernetes.io/projected/c97250f7-6ac0-48ea-897a-741b1bc97c1d-kube-api-access-488xj\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.071482 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c97250f7-6ac0-48ea-897a-741b1bc97c1d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.090356 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.093649 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.096489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ed0a-account-create-update-j8w29" event={"ID":"689df0bf-6323-4159-8433-8d916f33abff","Type":"ContainerDied","Data":"a36e20ad03cf42e3592125df384d8dfa950f609164ae084951c7aca45fa82b56"} Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.096528 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a36e20ad03cf42e3592125df384d8dfa950f609164ae084951c7aca45fa82b56" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.099024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3218-account-create-update-8lz4k" event={"ID":"c97250f7-6ac0-48ea-897a-741b1bc97c1d","Type":"ContainerDied","Data":"2c3fc54c2e6240611967784f2c95086a7a5b1595fc9cb774aca19045a2fef071"} Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.099089 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c3fc54c2e6240611967784f2c95086a7a5b1595fc9cb774aca19045a2fef071" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.099056 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3218-account-create-update-8lz4k" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.101794 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.102672 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8f3d4d78-a614-4724-ba42-bc6f0a44be83","Type":"ContainerStarted","Data":"3ffe69e9adf3e0d58852bed09e99964144f249d733c929efae466d7e6218f286"} Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.218626 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.218589876 podStartE2EDuration="6.218589876s" podCreationTimestamp="2025-12-05 08:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:46:08.199598874 +0000 UTC m=+1319.772202613" watchObservedRunningTime="2025-12-05 08:46:08.218589876 +0000 UTC m=+1319.791193615" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.298880 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a704000f-7677-42e5-86cd-4cbc8134785b-operator-scripts\") pod \"a704000f-7677-42e5-86cd-4cbc8134785b\" (UID: \"a704000f-7677-42e5-86cd-4cbc8134785b\") " Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.298994 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/689df0bf-6323-4159-8433-8d916f33abff-operator-scripts\") pod \"689df0bf-6323-4159-8433-8d916f33abff\" (UID: \"689df0bf-6323-4159-8433-8d916f33abff\") " Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.299103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj8dv\" (UniqueName: \"kubernetes.io/projected/a704000f-7677-42e5-86cd-4cbc8134785b-kube-api-access-sj8dv\") pod 
\"a704000f-7677-42e5-86cd-4cbc8134785b\" (UID: \"a704000f-7677-42e5-86cd-4cbc8134785b\") " Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.299137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/415c9625-8f52-41fc-818f-420b59863110-operator-scripts\") pod \"415c9625-8f52-41fc-818f-420b59863110\" (UID: \"415c9625-8f52-41fc-818f-420b59863110\") " Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.299238 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tq4s\" (UniqueName: \"kubernetes.io/projected/415c9625-8f52-41fc-818f-420b59863110-kube-api-access-7tq4s\") pod \"415c9625-8f52-41fc-818f-420b59863110\" (UID: \"415c9625-8f52-41fc-818f-420b59863110\") " Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.299287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28nfv\" (UniqueName: \"kubernetes.io/projected/689df0bf-6323-4159-8433-8d916f33abff-kube-api-access-28nfv\") pod \"689df0bf-6323-4159-8433-8d916f33abff\" (UID: \"689df0bf-6323-4159-8433-8d916f33abff\") " Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.315543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a704000f-7677-42e5-86cd-4cbc8134785b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a704000f-7677-42e5-86cd-4cbc8134785b" (UID: "a704000f-7677-42e5-86cd-4cbc8134785b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.323177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/689df0bf-6323-4159-8433-8d916f33abff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "689df0bf-6323-4159-8433-8d916f33abff" (UID: "689df0bf-6323-4159-8433-8d916f33abff"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.348652 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/415c9625-8f52-41fc-818f-420b59863110-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "415c9625-8f52-41fc-818f-420b59863110" (UID: "415c9625-8f52-41fc-818f-420b59863110"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.351405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689df0bf-6323-4159-8433-8d916f33abff-kube-api-access-28nfv" (OuterVolumeSpecName: "kube-api-access-28nfv") pod "689df0bf-6323-4159-8433-8d916f33abff" (UID: "689df0bf-6323-4159-8433-8d916f33abff"). InnerVolumeSpecName "kube-api-access-28nfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.356592 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/415c9625-8f52-41fc-818f-420b59863110-kube-api-access-7tq4s" (OuterVolumeSpecName: "kube-api-access-7tq4s") pod "415c9625-8f52-41fc-818f-420b59863110" (UID: "415c9625-8f52-41fc-818f-420b59863110"). InnerVolumeSpecName "kube-api-access-7tq4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.401914 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a704000f-7677-42e5-86cd-4cbc8134785b-kube-api-access-sj8dv" (OuterVolumeSpecName: "kube-api-access-sj8dv") pod "a704000f-7677-42e5-86cd-4cbc8134785b" (UID: "a704000f-7677-42e5-86cd-4cbc8134785b"). InnerVolumeSpecName "kube-api-access-sj8dv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.403713 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tq4s\" (UniqueName: \"kubernetes.io/projected/415c9625-8f52-41fc-818f-420b59863110-kube-api-access-7tq4s\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.403758 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28nfv\" (UniqueName: \"kubernetes.io/projected/689df0bf-6323-4159-8433-8d916f33abff-kube-api-access-28nfv\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.403770 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a704000f-7677-42e5-86cd-4cbc8134785b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.403780 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/689df0bf-6323-4159-8433-8d916f33abff-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.403788 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj8dv\" (UniqueName: \"kubernetes.io/projected/a704000f-7677-42e5-86cd-4cbc8134785b-kube-api-access-sj8dv\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:08 crc kubenswrapper[4795]: I1205 08:46:08.403797 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/415c9625-8f52-41fc-818f-420b59863110-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:09 crc kubenswrapper[4795]: I1205 08:46:09.135290 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"acfa2531-9c7e-4017-b32d-3a4b07038cca","Type":"ContainerStarted","Data":"a0f7e7b595d1d91144c8f84eb3841d701f9f91a08711ae750d88f2b2612e5ced"} Dec 05 08:46:09 crc kubenswrapper[4795]: I1205 08:46:09.142838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2nlv6" event={"ID":"415c9625-8f52-41fc-818f-420b59863110","Type":"ContainerDied","Data":"eeac53a2c7227503258c56bb5a266cc34d7823e6cc28c51cd35eecb3a45fae13"} Dec 05 08:46:09 crc kubenswrapper[4795]: I1205 08:46:09.142899 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeac53a2c7227503258c56bb5a266cc34d7823e6cc28c51cd35eecb3a45fae13" Dec 05 08:46:09 crc kubenswrapper[4795]: I1205 08:46:09.142995 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2nlv6" Dec 05 08:46:09 crc kubenswrapper[4795]: I1205 08:46:09.151793 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b" Dec 05 08:46:09 crc kubenswrapper[4795]: I1205 08:46:09.152374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e84-account-create-update-tqw8b" event={"ID":"a704000f-7677-42e5-86cd-4cbc8134785b","Type":"ContainerDied","Data":"c248d4120e0941355a0bf18db7c71f4817cd9e811009d795aa9a1871674b5a59"} Dec 05 08:46:09 crc kubenswrapper[4795]: I1205 08:46:09.152402 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c248d4120e0941355a0bf18db7c71f4817cd9e811009d795aa9a1871674b5a59" Dec 05 08:46:09 crc kubenswrapper[4795]: I1205 08:46:09.152456 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ed0a-account-create-update-j8w29" Dec 05 08:46:10 crc kubenswrapper[4795]: I1205 08:46:10.034693 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:46:10 crc kubenswrapper[4795]: I1205 08:46:10.169935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"acfa2531-9c7e-4017-b32d-3a4b07038cca","Type":"ContainerStarted","Data":"6b77d296678c5ce221b7aed0358e05e5b35a943cad9b3e58ecf038320b34695c"} Dec 05 08:46:10 crc kubenswrapper[4795]: I1205 08:46:10.209296 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.209265291 podStartE2EDuration="6.209265291s" podCreationTimestamp="2025-12-05 08:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:46:10.191716258 +0000 UTC m=+1321.764319997" watchObservedRunningTime="2025-12-05 08:46:10.209265291 +0000 UTC m=+1321.781869030" Dec 05 08:46:10 crc kubenswrapper[4795]: I1205 08:46:10.361146 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.182521 4795 generic.go:334] "Generic (PLEG): container finished" podID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerID="f342314a811195e8347957d7bd25884704a2490fcf0ab179dbae28b256eb8d28" 
exitCode=0 Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.182587 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerDied","Data":"f342314a811195e8347957d7bd25884704a2490fcf0ab179dbae28b256eb8d28"} Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.603560 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-js4j6"] Dec 05 08:46:11 crc kubenswrapper[4795]: E1205 08:46:11.604146 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689df0bf-6323-4159-8433-8d916f33abff" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604172 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="689df0bf-6323-4159-8433-8d916f33abff" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: E1205 08:46:11.604211 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc46818-e0ca-4331-9f50-f01e5f50d812" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604219 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc46818-e0ca-4331-9f50-f01e5f50d812" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: E1205 08:46:11.604238 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415c9625-8f52-41fc-818f-420b59863110" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604245 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="415c9625-8f52-41fc-818f-420b59863110" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: E1205 08:46:11.604252 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97250f7-6ac0-48ea-897a-741b1bc97c1d" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 
08:46:11.604258 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97250f7-6ac0-48ea-897a-741b1bc97c1d" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: E1205 08:46:11.604279 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a704000f-7677-42e5-86cd-4cbc8134785b" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604287 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a704000f-7677-42e5-86cd-4cbc8134785b" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: E1205 08:46:11.604298 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33dfb11f-a33f-463e-ae7a-8f3891042c4d" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604305 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33dfb11f-a33f-463e-ae7a-8f3891042c4d" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604492 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a704000f-7677-42e5-86cd-4cbc8134785b" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604504 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc46818-e0ca-4331-9f50-f01e5f50d812" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604516 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="33dfb11f-a33f-463e-ae7a-8f3891042c4d" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604530 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97250f7-6ac0-48ea-897a-741b1bc97c1d" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604540 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="689df0bf-6323-4159-8433-8d916f33abff" containerName="mariadb-account-create-update" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.604552 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="415c9625-8f52-41fc-818f-420b59863110" containerName="mariadb-database-create" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.605362 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.613412 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.613641 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-x7wcp" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.614023 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.630278 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-js4j6"] Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.732668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.732802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-config-data\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " 
pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.732908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-scripts\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.732953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8v4p\" (UniqueName: \"kubernetes.io/projected/72fc3705-c3fb-494a-8dc3-9949853a7c1a-kube-api-access-m8v4p\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.835118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-config-data\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.835296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-scripts\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.835335 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8v4p\" (UniqueName: \"kubernetes.io/projected/72fc3705-c3fb-494a-8dc3-9949853a7c1a-kube-api-access-m8v4p\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: 
\"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.835426 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.844342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-scripts\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.845080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-config-data\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.853237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.864716 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8v4p\" (UniqueName: \"kubernetes.io/projected/72fc3705-c3fb-494a-8dc3-9949853a7c1a-kube-api-access-m8v4p\") pod \"nova-cell0-conductor-db-sync-js4j6\" (UID: 
\"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") " pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:11 crc kubenswrapper[4795]: I1205 08:46:11.930161 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-js4j6" Dec 05 08:46:12 crc kubenswrapper[4795]: I1205 08:46:12.650922 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-js4j6"] Dec 05 08:46:13 crc kubenswrapper[4795]: I1205 08:46:13.143800 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 08:46:13 crc kubenswrapper[4795]: I1205 08:46:13.143877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 05 08:46:13 crc kubenswrapper[4795]: I1205 08:46:13.205412 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 08:46:13 crc kubenswrapper[4795]: I1205 08:46:13.219921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-js4j6" event={"ID":"72fc3705-c3fb-494a-8dc3-9949853a7c1a","Type":"ContainerStarted","Data":"b071520c637fe712012e524ee8a7f4f16e40093b9c8a472a41bdeed65801f4c4"} Dec 05 08:46:13 crc kubenswrapper[4795]: I1205 08:46:13.220559 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 08:46:13 crc kubenswrapper[4795]: I1205 08:46:13.250382 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 05 08:46:14 crc kubenswrapper[4795]: I1205 08:46:14.240906 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 05 08:46:15 crc kubenswrapper[4795]: I1205 08:46:15.247424 4795 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Dec 05 08:46:15 crc kubenswrapper[4795]: I1205 08:46:15.319406 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:15 crc kubenswrapper[4795]: I1205 08:46:15.319463 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:15 crc kubenswrapper[4795]: I1205 08:46:15.413046 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:15 crc kubenswrapper[4795]: I1205 08:46:15.414248 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:16 crc kubenswrapper[4795]: I1205 08:46:16.260117 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 08:46:16 crc kubenswrapper[4795]: I1205 08:46:16.260523 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 08:46:16 crc kubenswrapper[4795]: I1205 08:46:16.260358 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:16 crc kubenswrapper[4795]: I1205 08:46:16.260808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:19 crc kubenswrapper[4795]: I1205 08:46:19.155845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 08:46:19 crc kubenswrapper[4795]: I1205 08:46:19.156725 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 08:46:19 crc kubenswrapper[4795]: I1205 08:46:19.516956 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Dec 05 08:46:19 crc kubenswrapper[4795]: I1205 08:46:19.656745 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.042942 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.043034 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.048249 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"c588144102533680a34a5726505baa9bf35c19577cb0770b52dbb674df3a4575"} pod="openstack/horizon-797f5f5996-7wlp4" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.048373 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" containerID="cri-o://c588144102533680a34a5726505baa9bf35c19577cb0770b52dbb674df3a4575" gracePeriod=30 Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.358997 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.359103 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.360167 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"6048f959ec29f10a78dde1c3f2cbc14da8e357cd096b40fc95865c55fc1eeb57"} pod="openstack/horizon-57b485fdb4-h9cjs" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.360220 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" containerID="cri-o://6048f959ec29f10a78dde1c3f2cbc14da8e357cd096b40fc95865c55fc1eeb57" gracePeriod=30 Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.730097 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.730281 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 05 08:46:20 crc kubenswrapper[4795]: I1205 08:46:20.761578 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 05 08:46:27 crc kubenswrapper[4795]: I1205 08:46:27.514906 4795 generic.go:334] "Generic (PLEG): container finished" podID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerID="40fbd00fd40d345abc4a40b7a91ac0b2679acb18c81e418ee4ab5e07cf7be9d6" exitCode=137 Dec 05 08:46:27 crc kubenswrapper[4795]: I1205 08:46:27.515022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerDied","Data":"40fbd00fd40d345abc4a40b7a91ac0b2679acb18c81e418ee4ab5e07cf7be9d6"} Dec 05 08:46:28 crc kubenswrapper[4795]: E1205 08:46:28.689035 4795 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Dec 05 08:46:28 crc kubenswrapper[4795]: E1205 08:46:28.689402 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8v4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAs
Group:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-js4j6_openstack(72fc3705-c3fb-494a-8dc3-9949853a7c1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 08:46:28 crc kubenswrapper[4795]: E1205 08:46:28.690824 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-js4j6" podUID="72fc3705-c3fb-494a-8dc3-9949853a7c1a" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.048981 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.243265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-scripts\") pod \"fa02a411-fef9-4bb7-974a-17b4168d97c2\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.243433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-sg-core-conf-yaml\") pod \"fa02a411-fef9-4bb7-974a-17b4168d97c2\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.243540 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-combined-ca-bundle\") pod \"fa02a411-fef9-4bb7-974a-17b4168d97c2\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.243816 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-run-httpd\") pod \"fa02a411-fef9-4bb7-974a-17b4168d97c2\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.243877 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhmzf\" (UniqueName: \"kubernetes.io/projected/fa02a411-fef9-4bb7-974a-17b4168d97c2-kube-api-access-vhmzf\") pod \"fa02a411-fef9-4bb7-974a-17b4168d97c2\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.243923 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-config-data\") pod \"fa02a411-fef9-4bb7-974a-17b4168d97c2\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.243985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-log-httpd\") pod \"fa02a411-fef9-4bb7-974a-17b4168d97c2\" (UID: \"fa02a411-fef9-4bb7-974a-17b4168d97c2\") " Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.244961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fa02a411-fef9-4bb7-974a-17b4168d97c2" (UID: "fa02a411-fef9-4bb7-974a-17b4168d97c2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.247003 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fa02a411-fef9-4bb7-974a-17b4168d97c2" (UID: "fa02a411-fef9-4bb7-974a-17b4168d97c2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.255243 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-scripts" (OuterVolumeSpecName: "scripts") pod "fa02a411-fef9-4bb7-974a-17b4168d97c2" (UID: "fa02a411-fef9-4bb7-974a-17b4168d97c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.278775 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa02a411-fef9-4bb7-974a-17b4168d97c2-kube-api-access-vhmzf" (OuterVolumeSpecName: "kube-api-access-vhmzf") pod "fa02a411-fef9-4bb7-974a-17b4168d97c2" (UID: "fa02a411-fef9-4bb7-974a-17b4168d97c2"). InnerVolumeSpecName "kube-api-access-vhmzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.296856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fa02a411-fef9-4bb7-974a-17b4168d97c2" (UID: "fa02a411-fef9-4bb7-974a-17b4168d97c2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.347124 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.347163 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhmzf\" (UniqueName: \"kubernetes.io/projected/fa02a411-fef9-4bb7-974a-17b4168d97c2-kube-api-access-vhmzf\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.347174 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fa02a411-fef9-4bb7-974a-17b4168d97c2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.347186 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.347195 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.403575 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa02a411-fef9-4bb7-974a-17b4168d97c2" (UID: "fa02a411-fef9-4bb7-974a-17b4168d97c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.414958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-config-data" (OuterVolumeSpecName: "config-data") pod "fa02a411-fef9-4bb7-974a-17b4168d97c2" (UID: "fa02a411-fef9-4bb7-974a-17b4168d97c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.455785 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.455821 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa02a411-fef9-4bb7-974a-17b4168d97c2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.540166 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.542868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fa02a411-fef9-4bb7-974a-17b4168d97c2","Type":"ContainerDied","Data":"ea648398e804a0894569efa31eedad545d5c793535e97bf809bdbc7498dca453"} Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.542950 4795 scope.go:117] "RemoveContainer" containerID="40fbd00fd40d345abc4a40b7a91ac0b2679acb18c81e418ee4ab5e07cf7be9d6" Dec 05 08:46:29 crc kubenswrapper[4795]: E1205 08:46:29.543949 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-js4j6" podUID="72fc3705-c3fb-494a-8dc3-9949853a7c1a" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.580136 4795 scope.go:117] "RemoveContainer" containerID="c21f0d92ddc55d2246b7a80bdf5aaa3b60e0c5054806fc25421f033730834a3c" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.608647 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.625170 4795 scope.go:117] "RemoveContainer" containerID="e03c3622e6c2701c3d767ab0038d2a9e7a5fc606632c12c0551f91d5147bc0ef" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.637523 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.652560 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:29 crc kubenswrapper[4795]: E1205 08:46:29.653180 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="ceilometer-notification-agent" Dec 05 08:46:29 crc 
kubenswrapper[4795]: I1205 08:46:29.653199 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="ceilometer-notification-agent" Dec 05 08:46:29 crc kubenswrapper[4795]: E1205 08:46:29.653213 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="sg-core" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.653224 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="sg-core" Dec 05 08:46:29 crc kubenswrapper[4795]: E1205 08:46:29.653264 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="proxy-httpd" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.653271 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="proxy-httpd" Dec 05 08:46:29 crc kubenswrapper[4795]: E1205 08:46:29.653288 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="ceilometer-central-agent" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.653296 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="ceilometer-central-agent" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.653498 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="ceilometer-central-agent" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.653519 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="sg-core" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.653537 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="proxy-httpd" Dec 05 08:46:29 crc 
kubenswrapper[4795]: I1205 08:46:29.653549 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" containerName="ceilometer-notification-agent" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.655960 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.661873 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.664607 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-run-httpd\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.664752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.664818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crp57\" (UniqueName: \"kubernetes.io/projected/8b7a1736-e594-4359-9cff-90bd13d48c1f-kube-api-access-crp57\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.664864 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-config-data\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " 
pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.664908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-log-httpd\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.664984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-scripts\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.665021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.672124 4795 scope.go:117] "RemoveContainer" containerID="f342314a811195e8347957d7bd25884704a2490fcf0ab179dbae28b256eb8d28" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.675609 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.676471 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.766252 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-config-data\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 
08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.766314 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-log-httpd\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.766376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-scripts\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.766399 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.766426 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-run-httpd\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.766486 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.766524 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crp57\" (UniqueName: 
\"kubernetes.io/projected/8b7a1736-e594-4359-9cff-90bd13d48c1f-kube-api-access-crp57\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.769083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-log-httpd\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.769255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-run-httpd\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.777039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-scripts\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.781895 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.788428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crp57\" (UniqueName: \"kubernetes.io/projected/8b7a1736-e594-4359-9cff-90bd13d48c1f-kube-api-access-crp57\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.794101 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:29 crc kubenswrapper[4795]: I1205 08:46:29.796689 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-config-data\") pod \"ceilometer-0\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " pod="openstack/ceilometer-0" Dec 05 08:46:30 crc kubenswrapper[4795]: I1205 08:46:30.058475 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:46:30 crc kubenswrapper[4795]: I1205 08:46:30.699048 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:30 crc kubenswrapper[4795]: W1205 08:46:30.728333 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b7a1736_e594_4359_9cff_90bd13d48c1f.slice/crio-46a811b086d36d7aeeecaff02ddffffc6c91e4ad372ac187555c4a3a1326674e WatchSource:0}: Error finding container 46a811b086d36d7aeeecaff02ddffffc6c91e4ad372ac187555c4a3a1326674e: Status 404 returned error can't find the container with id 46a811b086d36d7aeeecaff02ddffffc6c91e4ad372ac187555c4a3a1326674e Dec 05 08:46:30 crc kubenswrapper[4795]: I1205 08:46:30.761055 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa02a411-fef9-4bb7-974a-17b4168d97c2" path="/var/lib/kubelet/pods/fa02a411-fef9-4bb7-974a-17b4168d97c2/volumes" Dec 05 08:46:31 crc kubenswrapper[4795]: I1205 08:46:31.568988 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerStarted","Data":"be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee"} Dec 05 08:46:31 crc kubenswrapper[4795]: I1205 08:46:31.569418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerStarted","Data":"46a811b086d36d7aeeecaff02ddffffc6c91e4ad372ac187555c4a3a1326674e"} Dec 05 08:46:32 crc kubenswrapper[4795]: I1205 08:46:32.581835 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerStarted","Data":"e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b"} Dec 05 08:46:33 crc kubenswrapper[4795]: I1205 08:46:33.596476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerStarted","Data":"57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a"} Dec 05 08:46:35 crc kubenswrapper[4795]: I1205 08:46:35.657150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerStarted","Data":"b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f"} Dec 05 08:46:35 crc kubenswrapper[4795]: I1205 08:46:35.658150 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 08:46:35 crc kubenswrapper[4795]: I1205 08:46:35.686980 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.834173502 podStartE2EDuration="6.686955746s" podCreationTimestamp="2025-12-05 08:46:29 +0000 UTC" firstStartedPulling="2025-12-05 08:46:30.732026885 +0000 UTC m=+1342.304630624" lastFinishedPulling="2025-12-05 08:46:34.584809129 +0000 UTC m=+1346.157412868" observedRunningTime="2025-12-05 
08:46:35.68672113 +0000 UTC m=+1347.259324879" watchObservedRunningTime="2025-12-05 08:46:35.686955746 +0000 UTC m=+1347.259559485" Dec 05 08:46:39 crc kubenswrapper[4795]: I1205 08:46:39.040334 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:39 crc kubenswrapper[4795]: I1205 08:46:39.041418 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="ceilometer-central-agent" containerID="cri-o://be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee" gracePeriod=30 Dec 05 08:46:39 crc kubenswrapper[4795]: I1205 08:46:39.042076 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="proxy-httpd" containerID="cri-o://b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f" gracePeriod=30 Dec 05 08:46:39 crc kubenswrapper[4795]: I1205 08:46:39.042131 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="sg-core" containerID="cri-o://57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a" gracePeriod=30 Dec 05 08:46:39 crc kubenswrapper[4795]: I1205 08:46:39.042177 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="ceilometer-notification-agent" containerID="cri-o://e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b" gracePeriod=30 Dec 05 08:46:39 crc kubenswrapper[4795]: I1205 08:46:39.720030 4795 generic.go:334] "Generic (PLEG): container finished" podID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerID="57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a" exitCode=2 Dec 05 08:46:39 crc kubenswrapper[4795]: I1205 08:46:39.722872 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerDied","Data":"57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a"} Dec 05 08:46:40 crc kubenswrapper[4795]: I1205 08:46:40.737994 4795 generic.go:334] "Generic (PLEG): container finished" podID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerID="b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f" exitCode=0 Dec 05 08:46:40 crc kubenswrapper[4795]: I1205 08:46:40.738417 4795 generic.go:334] "Generic (PLEG): container finished" podID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerID="e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b" exitCode=0 Dec 05 08:46:40 crc kubenswrapper[4795]: I1205 08:46:40.738218 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerDied","Data":"b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f"} Dec 05 08:46:40 crc kubenswrapper[4795]: I1205 08:46:40.738469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerDied","Data":"e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b"} Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.657106 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.670721 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-scripts\") pod \"8b7a1736-e594-4359-9cff-90bd13d48c1f\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.670799 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-combined-ca-bundle\") pod \"8b7a1736-e594-4359-9cff-90bd13d48c1f\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.670987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-config-data\") pod \"8b7a1736-e594-4359-9cff-90bd13d48c1f\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.671130 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-run-httpd\") pod \"8b7a1736-e594-4359-9cff-90bd13d48c1f\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.671190 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crp57\" (UniqueName: \"kubernetes.io/projected/8b7a1736-e594-4359-9cff-90bd13d48c1f-kube-api-access-crp57\") pod \"8b7a1736-e594-4359-9cff-90bd13d48c1f\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.671244 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-log-httpd\") pod \"8b7a1736-e594-4359-9cff-90bd13d48c1f\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.671348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-sg-core-conf-yaml\") pod \"8b7a1736-e594-4359-9cff-90bd13d48c1f\" (UID: \"8b7a1736-e594-4359-9cff-90bd13d48c1f\") " Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.672125 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8b7a1736-e594-4359-9cff-90bd13d48c1f" (UID: "8b7a1736-e594-4359-9cff-90bd13d48c1f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.672175 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8b7a1736-e594-4359-9cff-90bd13d48c1f" (UID: "8b7a1736-e594-4359-9cff-90bd13d48c1f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.678773 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b7a1736-e594-4359-9cff-90bd13d48c1f-kube-api-access-crp57" (OuterVolumeSpecName: "kube-api-access-crp57") pod "8b7a1736-e594-4359-9cff-90bd13d48c1f" (UID: "8b7a1736-e594-4359-9cff-90bd13d48c1f"). InnerVolumeSpecName "kube-api-access-crp57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.721316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-scripts" (OuterVolumeSpecName: "scripts") pod "8b7a1736-e594-4359-9cff-90bd13d48c1f" (UID: "8b7a1736-e594-4359-9cff-90bd13d48c1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.731773 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8b7a1736-e594-4359-9cff-90bd13d48c1f" (UID: "8b7a1736-e594-4359-9cff-90bd13d48c1f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.774275 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.775064 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.775086 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crp57\" (UniqueName: \"kubernetes.io/projected/8b7a1736-e594-4359-9cff-90bd13d48c1f-kube-api-access-crp57\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.775099 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b7a1736-e594-4359-9cff-90bd13d48c1f-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:45 crc 
kubenswrapper[4795]: I1205 08:46:45.775112 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.775750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b7a1736-e594-4359-9cff-90bd13d48c1f" (UID: "8b7a1736-e594-4359-9cff-90bd13d48c1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.805598 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-config-data" (OuterVolumeSpecName: "config-data") pod "8b7a1736-e594-4359-9cff-90bd13d48c1f" (UID: "8b7a1736-e594-4359-9cff-90bd13d48c1f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.819387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-js4j6" event={"ID":"72fc3705-c3fb-494a-8dc3-9949853a7c1a","Type":"ContainerStarted","Data":"503704dd0c63698020dbe8d1d29b72e242e762697425ffe87457e0153bc3b70b"} Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.822604 4795 generic.go:334] "Generic (PLEG): container finished" podID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerID="be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee" exitCode=0 Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.822660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerDied","Data":"be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee"} Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.822683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b7a1736-e594-4359-9cff-90bd13d48c1f","Type":"ContainerDied","Data":"46a811b086d36d7aeeecaff02ddffffc6c91e4ad372ac187555c4a3a1326674e"} Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.822701 4795 scope.go:117] "RemoveContainer" containerID="b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.822695 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.842105 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-js4j6" podStartSLOduration=2.296672318 podStartE2EDuration="34.842084059s" podCreationTimestamp="2025-12-05 08:46:11 +0000 UTC" firstStartedPulling="2025-12-05 08:46:12.65900507 +0000 UTC m=+1324.231608819" lastFinishedPulling="2025-12-05 08:46:45.204416821 +0000 UTC m=+1356.777020560" observedRunningTime="2025-12-05 08:46:45.83805734 +0000 UTC m=+1357.410661079" watchObservedRunningTime="2025-12-05 08:46:45.842084059 +0000 UTC m=+1357.414687798" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.849519 4795 scope.go:117] "RemoveContainer" containerID="57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.878256 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.878304 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b7a1736-e594-4359-9cff-90bd13d48c1f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.884720 4795 scope.go:117] "RemoveContainer" containerID="e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.887169 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.900021 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.923801 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Dec 05 08:46:45 crc kubenswrapper[4795]: E1205 08:46:45.924356 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="sg-core" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.924371 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="sg-core" Dec 05 08:46:45 crc kubenswrapper[4795]: E1205 08:46:45.924413 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="ceilometer-central-agent" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.924419 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="ceilometer-central-agent" Dec 05 08:46:45 crc kubenswrapper[4795]: E1205 08:46:45.924430 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="proxy-httpd" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.924437 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="proxy-httpd" Dec 05 08:46:45 crc kubenswrapper[4795]: E1205 08:46:45.924449 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="ceilometer-notification-agent" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.924456 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="ceilometer-notification-agent" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.924698 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="ceilometer-notification-agent" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.924706 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="sg-core" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.924722 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="proxy-httpd" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.924732 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" containerName="ceilometer-central-agent" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.926607 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.928832 4795 scope.go:117] "RemoveContainer" containerID="be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.933523 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.957444 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.960490 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.979856 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-config-data\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.979936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-log-httpd\") pod \"ceilometer-0\" (UID: 
\"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.979971 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mswl2\" (UniqueName: \"kubernetes.io/projected/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-kube-api-access-mswl2\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.980000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.980015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-run-httpd\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.980044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-scripts\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:45 crc kubenswrapper[4795]: I1205 08:46:45.980076 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 
08:46:46.008215 4795 scope.go:117] "RemoveContainer" containerID="b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f" Dec 05 08:46:46 crc kubenswrapper[4795]: E1205 08:46:46.008846 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f\": container with ID starting with b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f not found: ID does not exist" containerID="b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.008979 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f"} err="failed to get container status \"b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f\": rpc error: code = NotFound desc = could not find container \"b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f\": container with ID starting with b1b0f3fa7969e310838ccf0b3c83c87393e556626d16a7f06dd3c597f283734f not found: ID does not exist" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.009191 4795 scope.go:117] "RemoveContainer" containerID="57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a" Dec 05 08:46:46 crc kubenswrapper[4795]: E1205 08:46:46.009994 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a\": container with ID starting with 57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a not found: ID does not exist" containerID="57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.010021 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a"} err="failed to get container status \"57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a\": rpc error: code = NotFound desc = could not find container \"57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a\": container with ID starting with 57d97ccca3392fedbcd87cc2843854701163e9804c4fbe46ec3bf490df323d7a not found: ID does not exist" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.010055 4795 scope.go:117] "RemoveContainer" containerID="e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b" Dec 05 08:46:46 crc kubenswrapper[4795]: E1205 08:46:46.011166 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b\": container with ID starting with e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b not found: ID does not exist" containerID="e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.011211 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b"} err="failed to get container status \"e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b\": rpc error: code = NotFound desc = could not find container \"e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b\": container with ID starting with e1c786147178ca7ff1daafda9d4a574d758a3a69b44af1c4a9b2184a45666c0b not found: ID does not exist" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.011241 4795 scope.go:117] "RemoveContainer" containerID="be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee" Dec 05 08:46:46 crc kubenswrapper[4795]: E1205 08:46:46.011540 4795 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee\": container with ID starting with be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee not found: ID does not exist" containerID="be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.011564 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee"} err="failed to get container status \"be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee\": rpc error: code = NotFound desc = could not find container \"be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee\": container with ID starting with be7a3dda5959244fbfa43146c8ba515c584f3b5728f23f9bbcacb1a72c132aee not found: ID does not exist" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.080926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.080984 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-run-httpd\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.081015 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-scripts\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 
08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.081048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.081119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-config-data\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.081162 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-log-httpd\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.081183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mswl2\" (UniqueName: \"kubernetes.io/projected/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-kube-api-access-mswl2\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.083522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-run-httpd\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.083751 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-log-httpd\") 
pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.087058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-config-data\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.087751 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.089207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-scripts\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.089827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.100913 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mswl2\" (UniqueName: \"kubernetes.io/projected/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-kube-api-access-mswl2\") pod \"ceilometer-0\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.295257 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.765333 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b7a1736-e594-4359-9cff-90bd13d48c1f" path="/var/lib/kubelet/pods/8b7a1736-e594-4359-9cff-90bd13d48c1f/volumes" Dec 05 08:46:46 crc kubenswrapper[4795]: I1205 08:46:46.837586 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:46:46 crc kubenswrapper[4795]: W1205 08:46:46.848979 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd600da3b_a0d0_4ec9_99b8_8bbefcef48f4.slice/crio-d6d4ae88f153f96eb63a29a60a7eb094f57165e59f6d2e34ce37eef6f46f7bfb WatchSource:0}: Error finding container d6d4ae88f153f96eb63a29a60a7eb094f57165e59f6d2e34ce37eef6f46f7bfb: Status 404 returned error can't find the container with id d6d4ae88f153f96eb63a29a60a7eb094f57165e59f6d2e34ce37eef6f46f7bfb Dec 05 08:46:47 crc kubenswrapper[4795]: I1205 08:46:47.856965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerStarted","Data":"9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5"} Dec 05 08:46:47 crc kubenswrapper[4795]: I1205 08:46:47.857708 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerStarted","Data":"d6d4ae88f153f96eb63a29a60a7eb094f57165e59f6d2e34ce37eef6f46f7bfb"} Dec 05 08:46:48 crc kubenswrapper[4795]: I1205 08:46:48.875363 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerStarted","Data":"f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb"} Dec 05 08:46:49 crc kubenswrapper[4795]: I1205 08:46:49.889199 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerStarted","Data":"62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2"} Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.902712 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerStarted","Data":"6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9"} Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.904557 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.907081 4795 generic.go:334] "Generic (PLEG): container finished" podID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerID="c588144102533680a34a5726505baa9bf35c19577cb0770b52dbb674df3a4575" exitCode=137 Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.907231 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerDied","Data":"c588144102533680a34a5726505baa9bf35c19577cb0770b52dbb674df3a4575"} Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.907352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerStarted","Data":"2b1896dbf9af209dafcaec7ed5c0e7f124f57325e662ab2dcc06df5dc35609e4"} Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.907439 4795 scope.go:117] "RemoveContainer" containerID="ecee4fc18281693579f2445417cd59b08213910e3f12f77dc348f4cadec4c8ce" Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.916488 4795 generic.go:334] "Generic (PLEG): container finished" podID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerID="6048f959ec29f10a78dde1c3f2cbc14da8e357cd096b40fc95865c55fc1eeb57" exitCode=137 Dec 05 08:46:50 crc 
kubenswrapper[4795]: I1205 08:46:50.916559 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerDied","Data":"6048f959ec29f10a78dde1c3f2cbc14da8e357cd096b40fc95865c55fc1eeb57"} Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.916602 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerStarted","Data":"a84791e2ae7b68c09f8fd75787a1361ec4ab9189478ca4102f37be6973b89990"} Dec 05 08:46:50 crc kubenswrapper[4795]: I1205 08:46:50.949903 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9146333909999997 podStartE2EDuration="5.9498766s" podCreationTimestamp="2025-12-05 08:46:45 +0000 UTC" firstStartedPulling="2025-12-05 08:46:46.85607627 +0000 UTC m=+1358.428680009" lastFinishedPulling="2025-12-05 08:46:49.891319479 +0000 UTC m=+1361.463923218" observedRunningTime="2025-12-05 08:46:50.926625464 +0000 UTC m=+1362.499229213" watchObservedRunningTime="2025-12-05 08:46:50.9498766 +0000 UTC m=+1362.522480339" Dec 05 08:46:51 crc kubenswrapper[4795]: I1205 08:46:51.148780 4795 scope.go:117] "RemoveContainer" containerID="be19f62cf7c60fe65931433e0a5734a5bcb27c66fffa49ad918909cee2adf63a" Dec 05 08:47:00 crc kubenswrapper[4795]: I1205 08:47:00.029205 4795 generic.go:334] "Generic (PLEG): container finished" podID="72fc3705-c3fb-494a-8dc3-9949853a7c1a" containerID="503704dd0c63698020dbe8d1d29b72e242e762697425ffe87457e0153bc3b70b" exitCode=0 Dec 05 08:47:00 crc kubenswrapper[4795]: I1205 08:47:00.029418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-js4j6" event={"ID":"72fc3705-c3fb-494a-8dc3-9949853a7c1a","Type":"ContainerDied","Data":"503704dd0c63698020dbe8d1d29b72e242e762697425ffe87457e0153bc3b70b"} Dec 05 08:47:00 crc 
kubenswrapper[4795]: I1205 08:47:00.032643 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:47:00 crc kubenswrapper[4795]: I1205 08:47:00.033023 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:47:00 crc kubenswrapper[4795]: I1205 08:47:00.035115 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:47:00 crc kubenswrapper[4795]: I1205 08:47:00.359137 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:47:00 crc kubenswrapper[4795]: I1205 08:47:00.359192 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:47:00 crc kubenswrapper[4795]: I1205 08:47:00.362178 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.470120 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-js4j6"
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.659124 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-combined-ca-bundle\") pod \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") "
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.659210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-config-data\") pod \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") "
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.659481 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8v4p\" (UniqueName: \"kubernetes.io/projected/72fc3705-c3fb-494a-8dc3-9949853a7c1a-kube-api-access-m8v4p\") pod \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") "
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.659506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-scripts\") pod \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\" (UID: \"72fc3705-c3fb-494a-8dc3-9949853a7c1a\") "
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.680885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72fc3705-c3fb-494a-8dc3-9949853a7c1a-kube-api-access-m8v4p" (OuterVolumeSpecName: "kube-api-access-m8v4p") pod "72fc3705-c3fb-494a-8dc3-9949853a7c1a" (UID: "72fc3705-c3fb-494a-8dc3-9949853a7c1a"). InnerVolumeSpecName "kube-api-access-m8v4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.697858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-scripts" (OuterVolumeSpecName: "scripts") pod "72fc3705-c3fb-494a-8dc3-9949853a7c1a" (UID: "72fc3705-c3fb-494a-8dc3-9949853a7c1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.739084 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-config-data" (OuterVolumeSpecName: "config-data") pod "72fc3705-c3fb-494a-8dc3-9949853a7c1a" (UID: "72fc3705-c3fb-494a-8dc3-9949853a7c1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.763850 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.763890 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8v4p\" (UniqueName: \"kubernetes.io/projected/72fc3705-c3fb-494a-8dc3-9949853a7c1a-kube-api-access-m8v4p\") on node \"crc\" DevicePath \"\""
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.763904 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.837917 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72fc3705-c3fb-494a-8dc3-9949853a7c1a" (UID: "72fc3705-c3fb-494a-8dc3-9949853a7c1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:47:01 crc kubenswrapper[4795]: I1205 08:47:01.866510 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fc3705-c3fb-494a-8dc3-9949853a7c1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.051872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-js4j6" event={"ID":"72fc3705-c3fb-494a-8dc3-9949853a7c1a","Type":"ContainerDied","Data":"b071520c637fe712012e524ee8a7f4f16e40093b9c8a472a41bdeed65801f4c4"}
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.051932 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b071520c637fe712012e524ee8a7f4f16e40093b9c8a472a41bdeed65801f4c4"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.051953 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-js4j6"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.256720 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 05 08:47:02 crc kubenswrapper[4795]: E1205 08:47:02.257252 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72fc3705-c3fb-494a-8dc3-9949853a7c1a" containerName="nova-cell0-conductor-db-sync"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.257271 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="72fc3705-c3fb-494a-8dc3-9949853a7c1a" containerName="nova-cell0-conductor-db-sync"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.257474 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="72fc3705-c3fb-494a-8dc3-9949853a7c1a" containerName="nova-cell0-conductor-db-sync"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.258258 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.261272 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-x7wcp"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.261905 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.274226 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.375864 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04716e6-118a-45f7-b0a2-038650cb3baf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.375927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8ph\" (UniqueName: \"kubernetes.io/projected/d04716e6-118a-45f7-b0a2-038650cb3baf-kube-api-access-px8ph\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.375966 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04716e6-118a-45f7-b0a2-038650cb3baf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.478086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04716e6-118a-45f7-b0a2-038650cb3baf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.478141 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8ph\" (UniqueName: \"kubernetes.io/projected/d04716e6-118a-45f7-b0a2-038650cb3baf-kube-api-access-px8ph\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.478167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04716e6-118a-45f7-b0a2-038650cb3baf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.484404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04716e6-118a-45f7-b0a2-038650cb3baf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.491408 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04716e6-118a-45f7-b0a2-038650cb3baf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.502699 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8ph\" (UniqueName: \"kubernetes.io/projected/d04716e6-118a-45f7-b0a2-038650cb3baf-kube-api-access-px8ph\") pod \"nova-cell0-conductor-0\" (UID: \"d04716e6-118a-45f7-b0a2-038650cb3baf\") " pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:02 crc kubenswrapper[4795]: I1205 08:47:02.585411 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:03 crc kubenswrapper[4795]: I1205 08:47:03.146296 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 05 08:47:04 crc kubenswrapper[4795]: I1205 08:47:04.073996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d04716e6-118a-45f7-b0a2-038650cb3baf","Type":"ContainerStarted","Data":"c142e35c9c6dffaf6286634578fec122bbb87d8c74c43a726403f4f68a4cedef"}
Dec 05 08:47:04 crc kubenswrapper[4795]: I1205 08:47:04.074475 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:04 crc kubenswrapper[4795]: I1205 08:47:04.074490 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d04716e6-118a-45f7-b0a2-038650cb3baf","Type":"ContainerStarted","Data":"6ec923b07d370633301531e909fca55456e1a0b972674a9b1aae2d87a17a96bf"}
Dec 05 08:47:04 crc kubenswrapper[4795]: I1205 08:47:04.102101 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.102079674 podStartE2EDuration="2.102079674s" podCreationTimestamp="2025-12-05 08:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:47:04.096696389 +0000 UTC m=+1375.669300128" watchObservedRunningTime="2025-12-05 08:47:04.102079674 +0000 UTC m=+1375.674683413"
Dec 05 08:47:10 crc kubenswrapper[4795]: I1205 08:47:10.033435 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 05 08:47:10 crc kubenswrapper[4795]: I1205 08:47:10.359836 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Dec 05 08:47:12 crc kubenswrapper[4795]: I1205 08:47:12.618503 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.168078 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-ltnvx"]
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.176975 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.184496 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.184658 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.228498 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ltnvx"]
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.348216 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xlsv\" (UniqueName: \"kubernetes.io/projected/f2bcede8-6fd6-409e-83ef-306a7912dc7f-kube-api-access-5xlsv\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.348298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-config-data\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.348347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.348396 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-scripts\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.412943 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.414977 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.427910 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.451496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xlsv\" (UniqueName: \"kubernetes.io/projected/f2bcede8-6fd6-409e-83ef-306a7912dc7f-kube-api-access-5xlsv\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.451574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-config-data\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.451634 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.451680 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-scripts\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.462664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-scripts\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.463854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.468761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-config-data\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.504567 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.526920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xlsv\" (UniqueName: \"kubernetes.io/projected/f2bcede8-6fd6-409e-83ef-306a7912dc7f-kube-api-access-5xlsv\") pod \"nova-cell0-cell-mapping-ltnvx\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.562227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9brjc\" (UniqueName: \"kubernetes.io/projected/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-kube-api-access-9brjc\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.562309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-logs\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.562375 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-config-data\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.562459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.657646 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.660495 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.664190 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9brjc\" (UniqueName: \"kubernetes.io/projected/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-kube-api-access-9brjc\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.664238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-logs\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.664277 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-config-data\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.664327 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.665082 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.665790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-logs\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.687210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-config-data\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.688104 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.716426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9brjc\" (UniqueName: \"kubernetes.io/projected/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-kube-api-access-9brjc\") pod \"nova-api-0\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.735699 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.747495 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.771166 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4qmb\" (UniqueName: \"kubernetes.io/projected/4d227940-c98f-416a-bcfb-460e243138b3-kube-api-access-l4qmb\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.771234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-config-data\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.771300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.822228 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ltnvx"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.868119 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.870111 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.873827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.874825 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4qmb\" (UniqueName: \"kubernetes.io/projected/4d227940-c98f-416a-bcfb-460e243138b3-kube-api-access-l4qmb\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.874966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-config-data\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.878112 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.888711 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.890327 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-config-data\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.978112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.986341 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8713212-260c-47e9-ace2-16535a611c7b-logs\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.986630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p94bf\" (UniqueName: \"kubernetes.io/projected/f8713212-260c-47e9-ace2-16535a611c7b-kube-api-access-p94bf\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.987257 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-config-data\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:13 crc kubenswrapper[4795]: I1205 08:47:13.994706 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.042458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4qmb\" (UniqueName: \"kubernetes.io/projected/4d227940-c98f-416a-bcfb-460e243138b3-kube-api-access-l4qmb\") pod \"nova-scheduler-0\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " pod="openstack/nova-scheduler-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.089322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-config-data\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.089550 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.089596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8713212-260c-47e9-ace2-16535a611c7b-logs\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.089641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p94bf\" (UniqueName: \"kubernetes.io/projected/f8713212-260c-47e9-ace2-16535a611c7b-kube-api-access-p94bf\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.093901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8713212-260c-47e9-ace2-16535a611c7b-logs\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.105458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.112860 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-config-data\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.135224 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.215440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p94bf\" (UniqueName: \"kubernetes.io/projected/f8713212-260c-47e9-ace2-16535a611c7b-kube-api-access-p94bf\") pod \"nova-metadata-0\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.308582 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.320121 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.348022 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.365288 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.502202 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-ltkvs"]
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.502576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.503816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.504139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mccvm\" (UniqueName: \"kubernetes.io/projected/6060b4e3-f743-45b7-b401-0e01950aadd8-kube-api-access-mccvm\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.511692 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.526598 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.574360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-ltkvs"]
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.623185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.637358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.637421 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.637482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.637572 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-config\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.637700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.637770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.637800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mccvm\" (UniqueName: \"kubernetes.io/projected/6060b4e3-f743-45b7-b401-0e01950aadd8-kube-api-access-mccvm\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.637818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbt62\" (UniqueName: \"kubernetes.io/projected/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-kube-api-access-wbt62\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.693908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.695406 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.701568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mccvm\" (UniqueName: \"kubernetes.io/projected/6060b4e3-f743-45b7-b401-0e01950aadd8-kube-api-access-mccvm\") pod \"nova-cell1-novncproxy-0\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.740492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-config\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.740578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.740640 4795 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.740667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbt62\" (UniqueName: \"kubernetes.io/projected/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-kube-api-access-wbt62\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.740709 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.740762 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.742554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-svc\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.743239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-config\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.744843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.746499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.746707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.797499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbt62\" (UniqueName: \"kubernetes.io/projected/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-kube-api-access-wbt62\") pod \"dnsmasq-dns-bccf8f775-ltkvs\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") " pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.924138 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:14 crc kubenswrapper[4795]: I1205 08:47:14.972762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:15 crc kubenswrapper[4795]: I1205 08:47:15.096342 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:47:15 crc kubenswrapper[4795]: W1205 08:47:15.194488 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bf1ff3f_1907_4a4c_8ae0_e028e37f1022.slice/crio-a08e2453fa24be7481f5c1c8b0149ae6d3324c812164aeab6db6930050f69f5a WatchSource:0}: Error finding container a08e2453fa24be7481f5c1c8b0149ae6d3324c812164aeab6db6930050f69f5a: Status 404 returned error can't find the container with id a08e2453fa24be7481f5c1c8b0149ae6d3324c812164aeab6db6930050f69f5a Dec 05 08:47:15 crc kubenswrapper[4795]: I1205 08:47:15.295208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022","Type":"ContainerStarted","Data":"a08e2453fa24be7481f5c1c8b0149ae6d3324c812164aeab6db6930050f69f5a"} Dec 05 08:47:15 crc kubenswrapper[4795]: I1205 08:47:15.301486 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-ltnvx"] Dec 05 08:47:15 crc kubenswrapper[4795]: I1205 08:47:15.980429 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.021278 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-ltkvs"] Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.135005 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.204675 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.309529 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.315906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" event={"ID":"244cc672-47b5-4bdf-9466-f2f0f7c73e8f","Type":"ContainerStarted","Data":"d595575fc4ae0fffc080659e83e924a00f3208aedf13b3e7a7c81e8e296c96e2"} Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.332264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ltnvx" event={"ID":"f2bcede8-6fd6-409e-83ef-306a7912dc7f","Type":"ContainerStarted","Data":"83bd713295be536c586766caaae00dfd52bb90420b425a3c3e0a75afdc216cbe"} Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.332320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ltnvx" event={"ID":"f2bcede8-6fd6-409e-83ef-306a7912dc7f","Type":"ContainerStarted","Data":"4479a61d41e434a6b7b9539a87893d4b1523d327df8adc8d57d35af87f2d47b3"} Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.341139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8713212-260c-47e9-ace2-16535a611c7b","Type":"ContainerStarted","Data":"3debae000473f101e626e41739d4f713cf91759920c22f89fa6dff482b3b5cfa"} Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.360016 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d227940-c98f-416a-bcfb-460e243138b3","Type":"ContainerStarted","Data":"d32cf4a86a8dc9627a8a259fb723e6a336b0bfc6a9762b0b30506f289dd98101"} Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.379525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"6060b4e3-f743-45b7-b401-0e01950aadd8","Type":"ContainerStarted","Data":"33b44d9d0fd56540faf7b3066eef06e8c19985c479bb234316828de447974aff"} Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.804696 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-ltnvx" podStartSLOduration=3.804666799 podStartE2EDuration="3.804666799s" podCreationTimestamp="2025-12-05 08:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:47:16.432504428 +0000 UTC m=+1388.005108167" watchObservedRunningTime="2025-12-05 08:47:16.804666799 +0000 UTC m=+1388.377270538" Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.820132 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4qr2p"] Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.826632 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.832118 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.832334 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.843824 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4qr2p"] Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.969399 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwj2h\" (UniqueName: \"kubernetes.io/projected/4d614be9-37fb-439a-895d-eb5c92210497-kube-api-access-xwj2h\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.969581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.969637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-config-data\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:16 crc kubenswrapper[4795]: I1205 08:47:16.969667 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-scripts\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.071972 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwj2h\" (UniqueName: \"kubernetes.io/projected/4d614be9-37fb-439a-895d-eb5c92210497-kube-api-access-xwj2h\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.072119 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.072159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-config-data\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.072185 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-scripts\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.086983 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.090788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-scripts\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.101564 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-config-data\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.104890 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwj2h\" (UniqueName: \"kubernetes.io/projected/4d614be9-37fb-439a-895d-eb5c92210497-kube-api-access-xwj2h\") pod \"nova-cell1-conductor-db-sync-4qr2p\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.174785 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.425413 4795 generic.go:334] "Generic (PLEG): container finished" podID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" containerID="1e16a0455b8e7f7004d24d7f988a319f680124b57bcf502e280e053f4f21b33a" exitCode=0 Dec 05 08:47:17 crc kubenswrapper[4795]: I1205 08:47:17.425684 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" event={"ID":"244cc672-47b5-4bdf-9466-f2f0f7c73e8f","Type":"ContainerDied","Data":"1e16a0455b8e7f7004d24d7f988a319f680124b57bcf502e280e053f4f21b33a"} Dec 05 08:47:18 crc kubenswrapper[4795]: I1205 08:47:18.220329 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4qr2p"] Dec 05 08:47:18 crc kubenswrapper[4795]: I1205 08:47:18.477756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" event={"ID":"4d614be9-37fb-439a-895d-eb5c92210497","Type":"ContainerStarted","Data":"c00d37689d33b77d8c5b8c6920a4f54ce3fded9759f93f6432d3be40a0b9f224"} Dec 05 08:47:18 crc kubenswrapper[4795]: I1205 08:47:18.506256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" event={"ID":"244cc672-47b5-4bdf-9466-f2f0f7c73e8f","Type":"ContainerStarted","Data":"fb51cdfd590896e959394cafa5039139a64c7e35c084178852d0332dad2669e1"} Dec 05 08:47:18 crc kubenswrapper[4795]: I1205 08:47:18.507809 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:18 crc kubenswrapper[4795]: I1205 08:47:18.559483 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" podStartSLOduration=4.559454486 podStartE2EDuration="4.559454486s" podCreationTimestamp="2025-12-05 08:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:47:18.544380109 +0000 UTC m=+1390.116983848" watchObservedRunningTime="2025-12-05 08:47:18.559454486 +0000 UTC m=+1390.132058225" Dec 05 08:47:19 crc kubenswrapper[4795]: I1205 08:47:19.245995 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:47:19 crc kubenswrapper[4795]: I1205 08:47:19.273363 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 08:47:19 crc kubenswrapper[4795]: I1205 08:47:19.536005 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" event={"ID":"4d614be9-37fb-439a-895d-eb5c92210497","Type":"ContainerStarted","Data":"10cb78d4de0eb2cead73b25d4b22b2aa00b82621ad1e9971d37f2c0171c767a1"} Dec 05 08:47:19 crc kubenswrapper[4795]: I1205 08:47:19.562812 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" podStartSLOduration=3.5627868190000003 podStartE2EDuration="3.562786819s" podCreationTimestamp="2025-12-05 08:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:47:19.554585618 +0000 UTC m=+1391.127189357" watchObservedRunningTime="2025-12-05 08:47:19.562786819 +0000 UTC m=+1391.135390558" Dec 05 08:47:20 crc kubenswrapper[4795]: I1205 08:47:20.033894 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:47:20 crc kubenswrapper[4795]: I1205 08:47:20.034404 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:47:20 crc kubenswrapper[4795]: I1205 08:47:20.035527 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"2b1896dbf9af209dafcaec7ed5c0e7f124f57325e662ab2dcc06df5dc35609e4"} pod="openstack/horizon-797f5f5996-7wlp4" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 08:47:20 crc kubenswrapper[4795]: I1205 08:47:20.035643 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" containerID="cri-o://2b1896dbf9af209dafcaec7ed5c0e7f124f57325e662ab2dcc06df5dc35609e4" gracePeriod=30 Dec 05 08:47:20 crc kubenswrapper[4795]: I1205 08:47:20.359800 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:47:20 crc kubenswrapper[4795]: I1205 08:47:20.359943 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:47:20 crc kubenswrapper[4795]: I1205 08:47:20.361130 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"a84791e2ae7b68c09f8fd75787a1361ec4ab9189478ca4102f37be6973b89990"} pod="openstack/horizon-57b485fdb4-h9cjs" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 08:47:20 crc kubenswrapper[4795]: I1205 08:47:20.361191 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" 
containerID="cri-o://a84791e2ae7b68c09f8fd75787a1361ec4ab9189478ca4102f37be6973b89990" gracePeriod=30 Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.588068 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022","Type":"ContainerStarted","Data":"af012a9d9693925faa3d6d13d49614b43bba27225e84b1d8228d62a2caa3e7c2"} Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.589034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022","Type":"ContainerStarted","Data":"66b8d8120f4f47fd7f2d92e33996197c4f39417fa0f45351a5c59bafddbbe476"} Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.594543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d227940-c98f-416a-bcfb-460e243138b3","Type":"ContainerStarted","Data":"8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b"} Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.599307 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6060b4e3-f743-45b7-b401-0e01950aadd8","Type":"ContainerStarted","Data":"7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f"} Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.599502 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6060b4e3-f743-45b7-b401-0e01950aadd8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f" gracePeriod=30 Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.606778 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8713212-260c-47e9-ace2-16535a611c7b","Type":"ContainerStarted","Data":"e0fa72de53dc1ad92f9cb5cb6285c67be3685979872e0395ea1406815604c7ea"} Dec 05 
08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.606809 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8713212-260c-47e9-ace2-16535a611c7b","Type":"ContainerStarted","Data":"b598d3b7932243c20810b6e4bc88a2002d7aea4ca0187c81a917ed10053025ba"} Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.606907 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8713212-260c-47e9-ace2-16535a611c7b" containerName="nova-metadata-log" containerID="cri-o://b598d3b7932243c20810b6e4bc88a2002d7aea4ca0187c81a917ed10053025ba" gracePeriod=30 Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.607101 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f8713212-260c-47e9-ace2-16535a611c7b" containerName="nova-metadata-metadata" containerID="cri-o://e0fa72de53dc1ad92f9cb5cb6285c67be3685979872e0395ea1406815604c7ea" gracePeriod=30 Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.624592 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.148486483 podStartE2EDuration="10.624568057s" podCreationTimestamp="2025-12-05 08:47:13 +0000 UTC" firstStartedPulling="2025-12-05 08:47:15.21749447 +0000 UTC m=+1386.790098209" lastFinishedPulling="2025-12-05 08:47:22.693576044 +0000 UTC m=+1394.266179783" observedRunningTime="2025-12-05 08:47:23.614333181 +0000 UTC m=+1395.186936910" watchObservedRunningTime="2025-12-05 08:47:23.624568057 +0000 UTC m=+1395.197171796" Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.651212 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.083497114 podStartE2EDuration="10.651191514s" podCreationTimestamp="2025-12-05 08:47:13 +0000 UTC" firstStartedPulling="2025-12-05 08:47:16.128209246 +0000 UTC m=+1387.700812975" 
lastFinishedPulling="2025-12-05 08:47:22.695903636 +0000 UTC m=+1394.268507375" observedRunningTime="2025-12-05 08:47:23.639408107 +0000 UTC m=+1395.212011846" watchObservedRunningTime="2025-12-05 08:47:23.651191514 +0000 UTC m=+1395.223795253" Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.678195 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.000030706 podStartE2EDuration="10.678169242s" podCreationTimestamp="2025-12-05 08:47:13 +0000 UTC" firstStartedPulling="2025-12-05 08:47:16.017828442 +0000 UTC m=+1387.590432181" lastFinishedPulling="2025-12-05 08:47:22.695966978 +0000 UTC m=+1394.268570717" observedRunningTime="2025-12-05 08:47:23.669804526 +0000 UTC m=+1395.242408265" watchObservedRunningTime="2025-12-05 08:47:23.678169242 +0000 UTC m=+1395.250772981" Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.714954 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.259107108 podStartE2EDuration="9.714924242s" podCreationTimestamp="2025-12-05 08:47:14 +0000 UTC" firstStartedPulling="2025-12-05 08:47:16.244853561 +0000 UTC m=+1387.817457310" lastFinishedPulling="2025-12-05 08:47:22.700670705 +0000 UTC m=+1394.273274444" observedRunningTime="2025-12-05 08:47:23.713134255 +0000 UTC m=+1395.285738004" watchObservedRunningTime="2025-12-05 08:47:23.714924242 +0000 UTC m=+1395.287527991" Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.751286 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.751331 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.757053 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": dial tcp 10.217.0.184:8774: connect: connection refused" Dec 05 08:47:23 crc kubenswrapper[4795]: I1205 08:47:23.757103 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": dial tcp 10.217.0.184:8774: connect: connection refused" Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.136591 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.136661 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.176889 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.528312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.528380 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.620375 4795 generic.go:334] "Generic (PLEG): container finished" podID="f8713212-260c-47e9-ace2-16535a611c7b" containerID="b598d3b7932243c20810b6e4bc88a2002d7aea4ca0187c81a917ed10053025ba" exitCode=143 Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.621765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8713212-260c-47e9-ace2-16535a611c7b","Type":"ContainerDied","Data":"b598d3b7932243c20810b6e4bc88a2002d7aea4ca0187c81a917ed10053025ba"} Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.822260 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.925819 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" Dec 05 08:47:24 crc kubenswrapper[4795]: I1205 08:47:24.973964 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:25 crc kubenswrapper[4795]: I1205 08:47:25.111171 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-k8zsh"] Dec 05 08:47:25 crc kubenswrapper[4795]: I1205 08:47:25.111580 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" podUID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" containerName="dnsmasq-dns" containerID="cri-o://8db1beb533092d33bd20b6710443e54a5a52f8bf4df24d6481e3c74877db0617" gracePeriod=10 Dec 05 08:47:25 crc kubenswrapper[4795]: I1205 08:47:25.638733 4795 generic.go:334] "Generic (PLEG): container finished" podID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" containerID="8db1beb533092d33bd20b6710443e54a5a52f8bf4df24d6481e3c74877db0617" exitCode=0 Dec 05 08:47:25 crc kubenswrapper[4795]: I1205 08:47:25.638855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" event={"ID":"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2","Type":"ContainerDied","Data":"8db1beb533092d33bd20b6710443e54a5a52f8bf4df24d6481e3c74877db0617"} Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.521006 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.604386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-sb\") pod \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.604467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-config\") pod \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.604872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4rdf\" (UniqueName: \"kubernetes.io/projected/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-kube-api-access-z4rdf\") pod \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.604919 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-swift-storage-0\") pod \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.604993 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-nb\") pod \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.605137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-svc\") pod \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\" (UID: \"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2\") " Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.674983 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-kube-api-access-z4rdf" (OuterVolumeSpecName: "kube-api-access-z4rdf") pod "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" (UID: "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2"). InnerVolumeSpecName "kube-api-access-z4rdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.682933 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.683523 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-k8zsh" event={"ID":"bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2","Type":"ContainerDied","Data":"66ab8089d0897d982673c4da63a676e0455d43a90ebd4bd23ab939514472da33"} Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.683570 4795 scope.go:117] "RemoveContainer" containerID="8db1beb533092d33bd20b6710443e54a5a52f8bf4df24d6481e3c74877db0617" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.710031 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4rdf\" (UniqueName: \"kubernetes.io/projected/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-kube-api-access-z4rdf\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.738942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" (UID: "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.755306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" (UID: "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.766923 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-config" (OuterVolumeSpecName: "config") pod "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" (UID: "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.781595 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" (UID: "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.801997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" (UID: "bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.814431 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.814471 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.814482 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.814492 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.814501 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.852412 4795 scope.go:117] "RemoveContainer" containerID="b5a017bb4279a9b4099caf917181a06cc26e7799a700a0b689eebfb01b8b0c8d" Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.905656 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:47:26 crc kubenswrapper[4795]: I1205 08:47:26.906713 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="817d20b1-4cfa-4cae-98ae-cf2e4f379726" containerName="kube-state-metrics" 
containerID="cri-o://0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1" gracePeriod=30 Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.033347 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-k8zsh"] Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.042867 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-k8zsh"] Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.570704 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.644966 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txz62\" (UniqueName: \"kubernetes.io/projected/817d20b1-4cfa-4cae-98ae-cf2e4f379726-kube-api-access-txz62\") pod \"817d20b1-4cfa-4cae-98ae-cf2e4f379726\" (UID: \"817d20b1-4cfa-4cae-98ae-cf2e4f379726\") " Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.677031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/817d20b1-4cfa-4cae-98ae-cf2e4f379726-kube-api-access-txz62" (OuterVolumeSpecName: "kube-api-access-txz62") pod "817d20b1-4cfa-4cae-98ae-cf2e4f379726" (UID: "817d20b1-4cfa-4cae-98ae-cf2e4f379726"). InnerVolumeSpecName "kube-api-access-txz62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.713814 4795 generic.go:334] "Generic (PLEG): container finished" podID="817d20b1-4cfa-4cae-98ae-cf2e4f379726" containerID="0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1" exitCode=2 Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.714856 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.716044 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"817d20b1-4cfa-4cae-98ae-cf2e4f379726","Type":"ContainerDied","Data":"0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1"} Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.716095 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"817d20b1-4cfa-4cae-98ae-cf2e4f379726","Type":"ContainerDied","Data":"93de357d1f047405b0144d31e5f9c7ef9d27f9d87dcd954a20fcfcd0690b5228"} Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.716118 4795 scope.go:117] "RemoveContainer" containerID="0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1" Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.748132 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txz62\" (UniqueName: \"kubernetes.io/projected/817d20b1-4cfa-4cae-98ae-cf2e4f379726-kube-api-access-txz62\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.784143 4795 scope.go:117] "RemoveContainer" containerID="0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1" Dec 05 08:47:27 crc kubenswrapper[4795]: E1205 08:47:27.784690 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1\": container with ID starting with 0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1 not found: ID does not exist" containerID="0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1" Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.784727 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1"} err="failed to get container status \"0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1\": rpc error: code = NotFound desc = could not find container \"0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1\": container with ID starting with 0b526a3a839a5fc2d90eaec837db2376a427e6641353e83e66e985de868965c1 not found: ID does not exist" Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.831527 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:47:27 crc kubenswrapper[4795]: I1205 08:47:27.912704 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.133039 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:47:28 crc kubenswrapper[4795]: E1205 08:47:28.134309 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" containerName="init" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.134337 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" containerName="init" Dec 05 08:47:28 crc kubenswrapper[4795]: E1205 08:47:28.134396 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" containerName="dnsmasq-dns" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.134407 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" containerName="dnsmasq-dns" Dec 05 08:47:28 crc kubenswrapper[4795]: E1205 08:47:28.134427 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817d20b1-4cfa-4cae-98ae-cf2e4f379726" containerName="kube-state-metrics" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.134437 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="817d20b1-4cfa-4cae-98ae-cf2e4f379726" containerName="kube-state-metrics" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.134905 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="817d20b1-4cfa-4cae-98ae-cf2e4f379726" containerName="kube-state-metrics" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.134956 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" containerName="dnsmasq-dns" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.166407 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.183199 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.183517 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.204606 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.294137 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.294237 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" 
Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.294277 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.294341 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hqrc\" (UniqueName: \"kubernetes.io/projected/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-api-access-5hqrc\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.396807 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hqrc\" (UniqueName: \"kubernetes.io/projected/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-api-access-5hqrc\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.397367 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.397506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 
08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.397639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.415409 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.417053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.417790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hqrc\" (UniqueName: \"kubernetes.io/projected/1fd34f8e-69eb-4af2-ae44-4da71219ef35-kube-api-access-5hqrc\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.428205 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd34f8e-69eb-4af2-ae44-4da71219ef35-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fd34f8e-69eb-4af2-ae44-4da71219ef35\") " pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.552234 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.761290 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="817d20b1-4cfa-4cae-98ae-cf2e4f379726" path="/var/lib/kubelet/pods/817d20b1-4cfa-4cae-98ae-cf2e4f379726/volumes" Dec 05 08:47:28 crc kubenswrapper[4795]: I1205 08:47:28.763469 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2" path="/var/lib/kubelet/pods/bc69c8e1-899b-4dc2-a28e-7aeab9c9f3b2/volumes" Dec 05 08:47:29 crc kubenswrapper[4795]: I1205 08:47:29.238165 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 05 08:47:29 crc kubenswrapper[4795]: I1205 08:47:29.744784 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fd34f8e-69eb-4af2-ae44-4da71219ef35","Type":"ContainerStarted","Data":"7e66f42ab2eb260d7304294733a1a097e0852675afffd72acab170d4dd5e9072"} Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.569720 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.573496 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="ceilometer-central-agent" containerID="cri-o://9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5" gracePeriod=30 Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.573606 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="sg-core" containerID="cri-o://62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2" gracePeriod=30 Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.573544 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="proxy-httpd" containerID="cri-o://6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9" gracePeriod=30 Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.573578 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="ceilometer-notification-agent" containerID="cri-o://f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb" gracePeriod=30 Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.762190 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.762249 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fd34f8e-69eb-4af2-ae44-4da71219ef35","Type":"ContainerStarted","Data":"3b19000b4c30e96b26a9014c270f829b158f818b43c99ad957eafeb2f26e9dfe"} Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.767034 4795 generic.go:334] "Generic (PLEG): container finished" podID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerID="62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2" exitCode=2 Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.767104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerDied","Data":"62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2"} Dec 05 08:47:30 crc kubenswrapper[4795]: I1205 08:47:30.809605 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.092822698 podStartE2EDuration="3.809575376s" podCreationTimestamp="2025-12-05 08:47:27 +0000 UTC" firstStartedPulling="2025-12-05 
08:47:29.251728518 +0000 UTC m=+1400.824332257" lastFinishedPulling="2025-12-05 08:47:29.968481196 +0000 UTC m=+1401.541084935" observedRunningTime="2025-12-05 08:47:30.782365293 +0000 UTC m=+1402.354969042" watchObservedRunningTime="2025-12-05 08:47:30.809575376 +0000 UTC m=+1402.382179115" Dec 05 08:47:31 crc kubenswrapper[4795]: I1205 08:47:31.794406 4795 generic.go:334] "Generic (PLEG): container finished" podID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerID="6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9" exitCode=0 Dec 05 08:47:31 crc kubenswrapper[4795]: I1205 08:47:31.794804 4795 generic.go:334] "Generic (PLEG): container finished" podID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerID="9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5" exitCode=0 Dec 05 08:47:31 crc kubenswrapper[4795]: I1205 08:47:31.794693 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerDied","Data":"6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9"} Dec 05 08:47:31 crc kubenswrapper[4795]: I1205 08:47:31.794958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerDied","Data":"9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5"} Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.354956 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.474930 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mswl2\" (UniqueName: \"kubernetes.io/projected/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-kube-api-access-mswl2\") pod \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.475029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-config-data\") pod \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.475136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-run-httpd\") pod \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.475208 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-combined-ca-bundle\") pod \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.475382 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-scripts\") pod \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.475437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-sg-core-conf-yaml\") pod \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.475576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-log-httpd\") pod \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\" (UID: \"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4\") " Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.477171 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" (UID: "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.477799 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" (UID: "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.488031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-scripts" (OuterVolumeSpecName: "scripts") pod "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" (UID: "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.516377 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-kube-api-access-mswl2" (OuterVolumeSpecName: "kube-api-access-mswl2") pod "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" (UID: "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4"). InnerVolumeSpecName "kube-api-access-mswl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.551631 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" (UID: "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.581307 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mswl2\" (UniqueName: \"kubernetes.io/projected/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-kube-api-access-mswl2\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.581354 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.581364 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.581374 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.581383 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.614183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" (UID: "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.668136 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-config-data" (OuterVolumeSpecName: "config-data") pod "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" (UID: "d600da3b-a0d0-4ec9-99b8-8bbefcef48f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.683818 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.683863 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.819399 4795 generic.go:334] "Generic (PLEG): container finished" podID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerID="f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb" exitCode=0 Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.819504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerDied","Data":"f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb"} Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.819549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d600da3b-a0d0-4ec9-99b8-8bbefcef48f4","Type":"ContainerDied","Data":"d6d4ae88f153f96eb63a29a60a7eb094f57165e59f6d2e34ce37eef6f46f7bfb"} Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.819851 4795 scope.go:117] "RemoveContainer" containerID="6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.820048 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.823507 4795 generic.go:334] "Generic (PLEG): container finished" podID="f2bcede8-6fd6-409e-83ef-306a7912dc7f" containerID="83bd713295be536c586766caaae00dfd52bb90420b425a3c3e0a75afdc216cbe" exitCode=0 Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.823659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ltnvx" event={"ID":"f2bcede8-6fd6-409e-83ef-306a7912dc7f","Type":"ContainerDied","Data":"83bd713295be536c586766caaae00dfd52bb90420b425a3c3e0a75afdc216cbe"} Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.894147 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.942354 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.958235 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:47:33 crc kubenswrapper[4795]: E1205 08:47:33.959049 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="proxy-httpd" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.959134 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="proxy-httpd" Dec 05 08:47:33 crc kubenswrapper[4795]: E1205 08:47:33.959205 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="ceilometer-notification-agent" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.959277 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="ceilometer-notification-agent" Dec 05 08:47:33 crc kubenswrapper[4795]: E1205 08:47:33.959356 4795 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="ceilometer-central-agent" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.959432 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="ceilometer-central-agent" Dec 05 08:47:33 crc kubenswrapper[4795]: E1205 08:47:33.959499 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="sg-core" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.959560 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="sg-core" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.959874 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="proxy-httpd" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.960010 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="sg-core" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.960109 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="ceilometer-central-agent" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.960175 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" containerName="ceilometer-notification-agent" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.962400 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.969526 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.970255 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.970473 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 05 08:47:33 crc kubenswrapper[4795]: I1205 08:47:33.974063 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.106121 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.106194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-config-data\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.106364 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.106412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.106485 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqgmb\" (UniqueName: \"kubernetes.io/projected/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-kube-api-access-cqgmb\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.106510 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-scripts\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.106599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-run-httpd\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.106646 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-log-httpd\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.172141 4795 scope.go:117] "RemoveContainer" containerID="62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.209298 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.209375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-config-data\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.209479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.209516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.209557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgmb\" (UniqueName: \"kubernetes.io/projected/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-kube-api-access-cqgmb\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.209577 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-scripts\") pod \"ceilometer-0\" (UID: 
\"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.209675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-run-httpd\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.209697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-log-httpd\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.210432 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-log-httpd\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.211998 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-run-httpd\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.220149 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.229159 4795 scope.go:117] "RemoveContainer" containerID="f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb" Dec 05 08:47:34 crc 
kubenswrapper[4795]: I1205 08:47:34.233337 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.234339 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-config-data\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.234912 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.236504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-scripts\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.238171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqgmb\" (UniqueName: \"kubernetes.io/projected/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-kube-api-access-cqgmb\") pod \"ceilometer-0\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") " pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.302389 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.436993 4795 scope.go:117] "RemoveContainer" containerID="9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.468562 4795 scope.go:117] "RemoveContainer" containerID="6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9" Dec 05 08:47:34 crc kubenswrapper[4795]: E1205 08:47:34.473550 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9\": container with ID starting with 6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9 not found: ID does not exist" containerID="6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.473654 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9"} err="failed to get container status \"6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9\": rpc error: code = NotFound desc = could not find container \"6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9\": container with ID starting with 6eaaf1e01ebce1edd7182e5a154f0dedcb22e38fa7713d7ae0fca08a6cf4cce9 not found: ID does not exist" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.473693 4795 scope.go:117] "RemoveContainer" containerID="62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2" Dec 05 08:47:34 crc kubenswrapper[4795]: E1205 08:47:34.474090 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2\": container with ID starting with 
62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2 not found: ID does not exist" containerID="62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.474121 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2"} err="failed to get container status \"62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2\": rpc error: code = NotFound desc = could not find container \"62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2\": container with ID starting with 62fc41c07cbbb9496e5932582ff08f87d6ff4298c4025978aebb10e2fa5c6ca2 not found: ID does not exist" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.474136 4795 scope.go:117] "RemoveContainer" containerID="f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb" Dec 05 08:47:34 crc kubenswrapper[4795]: E1205 08:47:34.474471 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb\": container with ID starting with f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb not found: ID does not exist" containerID="f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.474513 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb"} err="failed to get container status \"f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb\": rpc error: code = NotFound desc = could not find container \"f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb\": container with ID starting with f806688f3adac0b6840009e7335a0fd905a6c71f6f437a26ebf3076d9843e1fb not found: ID does not 
exist" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.474530 4795 scope.go:117] "RemoveContainer" containerID="9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5" Dec 05 08:47:34 crc kubenswrapper[4795]: E1205 08:47:34.475109 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5\": container with ID starting with 9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5 not found: ID does not exist" containerID="9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.475147 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5"} err="failed to get container status \"9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5\": rpc error: code = NotFound desc = could not find container \"9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5\": container with ID starting with 9237cc909c7945544da3cd6d4e084dff2c2c58b31012b56d436a9d2eb33511a5 not found: ID does not exist" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.762711 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d600da3b-a0d0-4ec9-99b8-8bbefcef48f4" path="/var/lib/kubelet/pods/d600da3b-a0d0-4ec9-99b8-8bbefcef48f4/volumes" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.831970 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.832766 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:47:34 crc kubenswrapper[4795]: I1205 08:47:34.989793 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.384765 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ltnvx" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.482184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-config-data\") pod \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.482779 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-scripts\") pod \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.482902 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xlsv\" (UniqueName: \"kubernetes.io/projected/f2bcede8-6fd6-409e-83ef-306a7912dc7f-kube-api-access-5xlsv\") pod \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.485954 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-combined-ca-bundle\") pod \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\" (UID: \"f2bcede8-6fd6-409e-83ef-306a7912dc7f\") " Dec 05 08:47:35 crc 
kubenswrapper[4795]: I1205 08:47:35.493368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2bcede8-6fd6-409e-83ef-306a7912dc7f-kube-api-access-5xlsv" (OuterVolumeSpecName: "kube-api-access-5xlsv") pod "f2bcede8-6fd6-409e-83ef-306a7912dc7f" (UID: "f2bcede8-6fd6-409e-83ef-306a7912dc7f"). InnerVolumeSpecName "kube-api-access-5xlsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.512036 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-scripts" (OuterVolumeSpecName: "scripts") pod "f2bcede8-6fd6-409e-83ef-306a7912dc7f" (UID: "f2bcede8-6fd6-409e-83ef-306a7912dc7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.563917 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-config-data" (OuterVolumeSpecName: "config-data") pod "f2bcede8-6fd6-409e-83ef-306a7912dc7f" (UID: "f2bcede8-6fd6-409e-83ef-306a7912dc7f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.565848 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2bcede8-6fd6-409e-83ef-306a7912dc7f" (UID: "f2bcede8-6fd6-409e-83ef-306a7912dc7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.590238 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.590291 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xlsv\" (UniqueName: \"kubernetes.io/projected/f2bcede8-6fd6-409e-83ef-306a7912dc7f-kube-api-access-5xlsv\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.590309 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.590321 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bcede8-6fd6-409e-83ef-306a7912dc7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.851342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-ltnvx" event={"ID":"f2bcede8-6fd6-409e-83ef-306a7912dc7f","Type":"ContainerDied","Data":"4479a61d41e434a6b7b9539a87893d4b1523d327df8adc8d57d35af87f2d47b3"} Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.851767 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4479a61d41e434a6b7b9539a87893d4b1523d327df8adc8d57d35af87f2d47b3" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.851870 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-ltnvx" Dec 05 08:47:35 crc kubenswrapper[4795]: I1205 08:47:35.855431 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerStarted","Data":"cc2b5ce7a2f3f148d185f794b4a65a98bb7420c7bc8ab4032294a0ae7a783900"} Dec 05 08:47:36 crc kubenswrapper[4795]: I1205 08:47:36.159897 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:47:36 crc kubenswrapper[4795]: I1205 08:47:36.161277 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-api" containerID="cri-o://af012a9d9693925faa3d6d13d49614b43bba27225e84b1d8228d62a2caa3e7c2" gracePeriod=30 Dec 05 08:47:36 crc kubenswrapper[4795]: I1205 08:47:36.161239 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-log" containerID="cri-o://66b8d8120f4f47fd7f2d92e33996197c4f39417fa0f45351a5c59bafddbbe476" gracePeriod=30 Dec 05 08:47:36 crc kubenswrapper[4795]: I1205 08:47:36.291693 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:47:36 crc kubenswrapper[4795]: I1205 08:47:36.292393 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4d227940-c98f-416a-bcfb-460e243138b3" containerName="nova-scheduler-scheduler" containerID="cri-o://8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b" gracePeriod=30 Dec 05 08:47:36 crc kubenswrapper[4795]: I1205 08:47:36.869326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerStarted","Data":"6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946"} Dec 05 08:47:36 crc kubenswrapper[4795]: I1205 08:47:36.873673 4795 generic.go:334] "Generic (PLEG): container finished" podID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerID="66b8d8120f4f47fd7f2d92e33996197c4f39417fa0f45351a5c59bafddbbe476" exitCode=143 Dec 05 08:47:36 crc kubenswrapper[4795]: I1205 08:47:36.873730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022","Type":"ContainerDied","Data":"66b8d8120f4f47fd7f2d92e33996197c4f39417fa0f45351a5c59bafddbbe476"} Dec 05 08:47:37 crc kubenswrapper[4795]: I1205 08:47:37.889729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerStarted","Data":"2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7"} Dec 05 08:47:38 crc kubenswrapper[4795]: I1205 08:47:38.570429 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 05 08:47:38 crc kubenswrapper[4795]: I1205 08:47:38.903310 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerStarted","Data":"a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f"} Dec 05 08:47:39 crc kubenswrapper[4795]: E1205 08:47:39.149867 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 08:47:39 crc kubenswrapper[4795]: E1205 08:47:39.156761 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 08:47:39 crc kubenswrapper[4795]: E1205 08:47:39.161726 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 08:47:39 crc kubenswrapper[4795]: E1205 08:47:39.161803 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4d227940-c98f-416a-bcfb-460e243138b3" containerName="nova-scheduler-scheduler" Dec 05 08:47:39 crc kubenswrapper[4795]: I1205 08:47:39.915983 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 08:47:39 crc kubenswrapper[4795]: I1205 08:47:39.928224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerStarted","Data":"861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f"} Dec 05 08:47:39 crc kubenswrapper[4795]: I1205 08:47:39.929560 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 08:47:39 crc kubenswrapper[4795]: I1205 08:47:39.932115 4795 generic.go:334] "Generic (PLEG): container finished" podID="4d227940-c98f-416a-bcfb-460e243138b3" containerID="8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b" exitCode=0 Dec 05 08:47:39 crc kubenswrapper[4795]: I1205 08:47:39.932149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d227940-c98f-416a-bcfb-460e243138b3","Type":"ContainerDied","Data":"8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b"} Dec 05 08:47:39 crc kubenswrapper[4795]: I1205 08:47:39.932170 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4d227940-c98f-416a-bcfb-460e243138b3","Type":"ContainerDied","Data":"d32cf4a86a8dc9627a8a259fb723e6a336b0bfc6a9762b0b30506f289dd98101"} Dec 05 08:47:39 crc kubenswrapper[4795]: I1205 08:47:39.932192 4795 scope.go:117] "RemoveContainer" containerID="8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b" Dec 05 08:47:39 crc kubenswrapper[4795]: I1205 08:47:39.932310 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.010269 4795 scope.go:117] "RemoveContainer" containerID="8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b" Dec 05 08:47:40 crc kubenswrapper[4795]: E1205 08:47:40.016014 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b\": container with ID starting with 8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b not found: ID does not exist" containerID="8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.016067 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b"} err="failed to get container status \"8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b\": rpc error: code = NotFound desc = could not find container \"8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b\": container with ID starting with 8ef16bdf83f7215baacc2884b918ceb37dcc28db72ed0cd07195095e98b0f63b not found: ID does not exist" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.019912 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.999193332 podStartE2EDuration="7.019879382s" podCreationTimestamp="2025-12-05 08:47:33 +0000 UTC" firstStartedPulling="2025-12-05 08:47:35.087375457 +0000 UTC m=+1406.659979196" lastFinishedPulling="2025-12-05 08:47:39.108061507 +0000 UTC m=+1410.680665246" observedRunningTime="2025-12-05 08:47:40.010219053 +0000 UTC m=+1411.582822792" watchObservedRunningTime="2025-12-05 08:47:40.019879382 +0000 UTC m=+1411.592483121" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.118583 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-l4qmb\" (UniqueName: \"kubernetes.io/projected/4d227940-c98f-416a-bcfb-460e243138b3-kube-api-access-l4qmb\") pod \"4d227940-c98f-416a-bcfb-460e243138b3\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.118674 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-config-data\") pod \"4d227940-c98f-416a-bcfb-460e243138b3\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.118919 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-combined-ca-bundle\") pod \"4d227940-c98f-416a-bcfb-460e243138b3\" (UID: \"4d227940-c98f-416a-bcfb-460e243138b3\") " Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.128725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d227940-c98f-416a-bcfb-460e243138b3-kube-api-access-l4qmb" (OuterVolumeSpecName: "kube-api-access-l4qmb") pod "4d227940-c98f-416a-bcfb-460e243138b3" (UID: "4d227940-c98f-416a-bcfb-460e243138b3"). InnerVolumeSpecName "kube-api-access-l4qmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.170825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d227940-c98f-416a-bcfb-460e243138b3" (UID: "4d227940-c98f-416a-bcfb-460e243138b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.188905 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-config-data" (OuterVolumeSpecName: "config-data") pod "4d227940-c98f-416a-bcfb-460e243138b3" (UID: "4d227940-c98f-416a-bcfb-460e243138b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.221520 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.221564 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4qmb\" (UniqueName: \"kubernetes.io/projected/4d227940-c98f-416a-bcfb-460e243138b3-kube-api-access-l4qmb\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.221582 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d227940-c98f-416a-bcfb-460e243138b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.294154 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.311226 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.326307 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:47:40 crc kubenswrapper[4795]: E1205 08:47:40.327196 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d227940-c98f-416a-bcfb-460e243138b3" containerName="nova-scheduler-scheduler" Dec 05 08:47:40 crc 
kubenswrapper[4795]: I1205 08:47:40.327280 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d227940-c98f-416a-bcfb-460e243138b3" containerName="nova-scheduler-scheduler" Dec 05 08:47:40 crc kubenswrapper[4795]: E1205 08:47:40.327366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bcede8-6fd6-409e-83ef-306a7912dc7f" containerName="nova-manage" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.327430 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bcede8-6fd6-409e-83ef-306a7912dc7f" containerName="nova-manage" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.327851 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bcede8-6fd6-409e-83ef-306a7912dc7f" containerName="nova-manage" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.327934 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d227940-c98f-416a-bcfb-460e243138b3" containerName="nova-scheduler-scheduler" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.329306 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.332956 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.351050 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.528002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.529105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-config-data\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.529240 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlpb8\" (UniqueName: \"kubernetes.io/projected/d371839f-fc5b-4287-bccf-f0077497e3e2-kube-api-access-tlpb8\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.631133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.631707 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-config-data\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.631772 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlpb8\" (UniqueName: \"kubernetes.io/projected/d371839f-fc5b-4287-bccf-f0077497e3e2-kube-api-access-tlpb8\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.637497 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-config-data\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.637999 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.652881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlpb8\" (UniqueName: \"kubernetes.io/projected/d371839f-fc5b-4287-bccf-f0077497e3e2-kube-api-access-tlpb8\") pod \"nova-scheduler-0\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " pod="openstack/nova-scheduler-0" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.760275 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d227940-c98f-416a-bcfb-460e243138b3" path="/var/lib/kubelet/pods/4d227940-c98f-416a-bcfb-460e243138b3/volumes" 
Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.827876 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.828324 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:47:40 crc kubenswrapper[4795]: I1205 08:47:40.953460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 08:47:41 crc kubenswrapper[4795]: I1205 08:47:41.571410 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:47:41 crc kubenswrapper[4795]: I1205 08:47:41.958521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d371839f-fc5b-4287-bccf-f0077497e3e2","Type":"ContainerStarted","Data":"726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f"} Dec 05 08:47:41 crc kubenswrapper[4795]: I1205 08:47:41.959152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d371839f-fc5b-4287-bccf-f0077497e3e2","Type":"ContainerStarted","Data":"beb1d8ba660111f8edac80cd30c4f87730c3b9d0b8bdc9e90308e8083fedf244"} Dec 05 08:47:41 crc kubenswrapper[4795]: I1205 08:47:41.961779 4795 generic.go:334] "Generic (PLEG): container finished" podID="4d614be9-37fb-439a-895d-eb5c92210497" containerID="10cb78d4de0eb2cead73b25d4b22b2aa00b82621ad1e9971d37f2c0171c767a1" exitCode=0 Dec 05 08:47:41 crc kubenswrapper[4795]: I1205 08:47:41.961830 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" event={"ID":"4d614be9-37fb-439a-895d-eb5c92210497","Type":"ContainerDied","Data":"10cb78d4de0eb2cead73b25d4b22b2aa00b82621ad1e9971d37f2c0171c767a1"} Dec 05 08:47:42 crc kubenswrapper[4795]: I1205 08:47:42.009188 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.00914706 podStartE2EDuration="2.00914706s" podCreationTimestamp="2025-12-05 08:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:47:41.979559313 +0000 UTC m=+1413.552163052" watchObservedRunningTime="2025-12-05 08:47:42.00914706 +0000 UTC m=+1413.581750799" Dec 05 08:47:42 crc kubenswrapper[4795]: I1205 08:47:42.976535 4795 generic.go:334] "Generic (PLEG): container finished" podID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerID="af012a9d9693925faa3d6d13d49614b43bba27225e84b1d8228d62a2caa3e7c2" exitCode=0 Dec 05 08:47:42 crc kubenswrapper[4795]: I1205 08:47:42.976725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022","Type":"ContainerDied","Data":"af012a9d9693925faa3d6d13d49614b43bba27225e84b1d8228d62a2caa3e7c2"} Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.383140 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.399823 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-combined-ca-bundle\") pod \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.399883 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-config-data\") pod \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.399933 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9brjc\" (UniqueName: \"kubernetes.io/projected/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-kube-api-access-9brjc\") pod \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.400040 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-logs\") pod \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\" (UID: \"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022\") " Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.400572 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-logs" (OuterVolumeSpecName: "logs") pod "6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" (UID: "6bf1ff3f-1907-4a4c-8ae0-e028e37f1022"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.423018 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-kube-api-access-9brjc" (OuterVolumeSpecName: "kube-api-access-9brjc") pod "6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" (UID: "6bf1ff3f-1907-4a4c-8ae0-e028e37f1022"). InnerVolumeSpecName "kube-api-access-9brjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.502710 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.502753 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9brjc\" (UniqueName: \"kubernetes.io/projected/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-kube-api-access-9brjc\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.508837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" (UID: "6bf1ff3f-1907-4a4c-8ae0-e028e37f1022"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.521359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-config-data" (OuterVolumeSpecName: "config-data") pod "6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" (UID: "6bf1ff3f-1907-4a4c-8ae0-e028e37f1022"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.562166 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.604977 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.605030 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.706720 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwj2h\" (UniqueName: \"kubernetes.io/projected/4d614be9-37fb-439a-895d-eb5c92210497-kube-api-access-xwj2h\") pod \"4d614be9-37fb-439a-895d-eb5c92210497\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.706772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-scripts\") pod \"4d614be9-37fb-439a-895d-eb5c92210497\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.706873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-combined-ca-bundle\") pod \"4d614be9-37fb-439a-895d-eb5c92210497\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.707115 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-config-data\") pod \"4d614be9-37fb-439a-895d-eb5c92210497\" (UID: \"4d614be9-37fb-439a-895d-eb5c92210497\") " Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.711125 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d614be9-37fb-439a-895d-eb5c92210497-kube-api-access-xwj2h" (OuterVolumeSpecName: "kube-api-access-xwj2h") pod "4d614be9-37fb-439a-895d-eb5c92210497" (UID: "4d614be9-37fb-439a-895d-eb5c92210497"). InnerVolumeSpecName "kube-api-access-xwj2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.713065 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-scripts" (OuterVolumeSpecName: "scripts") pod "4d614be9-37fb-439a-895d-eb5c92210497" (UID: "4d614be9-37fb-439a-895d-eb5c92210497"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.736753 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d614be9-37fb-439a-895d-eb5c92210497" (UID: "4d614be9-37fb-439a-895d-eb5c92210497"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.753030 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-config-data" (OuterVolumeSpecName: "config-data") pod "4d614be9-37fb-439a-895d-eb5c92210497" (UID: "4d614be9-37fb-439a-895d-eb5c92210497"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.811294 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.811339 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.811353 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwj2h\" (UniqueName: \"kubernetes.io/projected/4d614be9-37fb-439a-895d-eb5c92210497-kube-api-access-xwj2h\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.811363 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d614be9-37fb-439a-895d-eb5c92210497-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.991083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" event={"ID":"4d614be9-37fb-439a-895d-eb5c92210497","Type":"ContainerDied","Data":"c00d37689d33b77d8c5b8c6920a4f54ce3fded9759f93f6432d3be40a0b9f224"} Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.991137 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-4qr2p" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.991171 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00d37689d33b77d8c5b8c6920a4f54ce3fded9759f93f6432d3be40a0b9f224" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.996157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6bf1ff3f-1907-4a4c-8ae0-e028e37f1022","Type":"ContainerDied","Data":"a08e2453fa24be7481f5c1c8b0149ae6d3324c812164aeab6db6930050f69f5a"} Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.996243 4795 scope.go:117] "RemoveContainer" containerID="af012a9d9693925faa3d6d13d49614b43bba27225e84b1d8228d62a2caa3e7c2" Dec 05 08:47:43 crc kubenswrapper[4795]: I1205 08:47:43.996386 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.058935 4795 scope.go:117] "RemoveContainer" containerID="66b8d8120f4f47fd7f2d92e33996197c4f39417fa0f45351a5c59bafddbbe476" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.079538 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.099257 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.118718 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 08:47:44 crc kubenswrapper[4795]: E1205 08:47:44.119335 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d614be9-37fb-439a-895d-eb5c92210497" containerName="nova-cell1-conductor-db-sync" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.119360 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d614be9-37fb-439a-895d-eb5c92210497" containerName="nova-cell1-conductor-db-sync" Dec 05 08:47:44 crc 
kubenswrapper[4795]: E1205 08:47:44.119385 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-log" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.119392 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-log" Dec 05 08:47:44 crc kubenswrapper[4795]: E1205 08:47:44.119419 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-api" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.119426 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-api" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.119676 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d614be9-37fb-439a-895d-eb5c92210497" containerName="nova-cell1-conductor-db-sync" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.119698 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-log" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.119706 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" containerName="nova-api-api" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.120926 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.126745 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.132734 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.137634 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.139314 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.141430 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.182492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.225238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nttdl\" (UniqueName: \"kubernetes.io/projected/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-kube-api-access-nttdl\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.225344 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-logs\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.225410 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.225445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-config-data\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.225468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2nqj\" (UniqueName: \"kubernetes.io/projected/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-kube-api-access-p2nqj\") pod \"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.225507 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.225530 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.327771 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.327829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-config-data\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.327858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2nqj\" (UniqueName: \"kubernetes.io/projected/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-kube-api-access-p2nqj\") pod \"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.327888 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.327906 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.327978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nttdl\" (UniqueName: \"kubernetes.io/projected/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-kube-api-access-nttdl\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: 
I1205 08:47:44.328018 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-logs\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.328665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-logs\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.348087 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.357859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.360952 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.375816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-config-data\") pod \"nova-api-0\" (UID: 
\"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.392452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2nqj\" (UniqueName: \"kubernetes.io/projected/6d31dbcf-b637-4b29-a68c-0b8f4226caf5-kube-api-access-p2nqj\") pod \"nova-cell1-conductor-0\" (UID: \"6d31dbcf-b637-4b29-a68c-0b8f4226caf5\") " pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.395445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nttdl\" (UniqueName: \"kubernetes.io/projected/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-kube-api-access-nttdl\") pod \"nova-api-0\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.482383 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.501841 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:44 crc kubenswrapper[4795]: I1205 08:47:44.774671 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf1ff3f-1907-4a4c-8ae0-e028e37f1022" path="/var/lib/kubelet/pods/6bf1ff3f-1907-4a4c-8ae0-e028e37f1022/volumes" Dec 05 08:47:45 crc kubenswrapper[4795]: I1205 08:47:45.071121 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:47:45 crc kubenswrapper[4795]: I1205 08:47:45.165866 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 05 08:47:45 crc kubenswrapper[4795]: W1205 08:47:45.177416 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d31dbcf_b637_4b29_a68c_0b8f4226caf5.slice/crio-de1a0cfc41cd2361e8c47fa9725741ab076f3f8b07ba3b10d271789a0da187ea WatchSource:0}: Error finding container de1a0cfc41cd2361e8c47fa9725741ab076f3f8b07ba3b10d271789a0da187ea: Status 404 returned error can't find the container with id de1a0cfc41cd2361e8c47fa9725741ab076f3f8b07ba3b10d271789a0da187ea Dec 05 08:47:45 crc kubenswrapper[4795]: I1205 08:47:45.954452 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 08:47:46 crc kubenswrapper[4795]: I1205 08:47:46.027649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6d31dbcf-b637-4b29-a68c-0b8f4226caf5","Type":"ContainerStarted","Data":"0c1301152575b8b20453fe10cbfb75920f4f47af03e3a9d8a514645398854b08"} Dec 05 08:47:46 crc kubenswrapper[4795]: I1205 08:47:46.027718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6d31dbcf-b637-4b29-a68c-0b8f4226caf5","Type":"ContainerStarted","Data":"de1a0cfc41cd2361e8c47fa9725741ab076f3f8b07ba3b10d271789a0da187ea"} Dec 05 08:47:46 crc kubenswrapper[4795]: I1205 
08:47:46.029292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:46 crc kubenswrapper[4795]: I1205 08:47:46.034298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0feed6b1-c0cf-47f4-bdf0-da2147628dfb","Type":"ContainerStarted","Data":"9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc"} Dec 05 08:47:46 crc kubenswrapper[4795]: I1205 08:47:46.034335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0feed6b1-c0cf-47f4-bdf0-da2147628dfb","Type":"ContainerStarted","Data":"838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b"} Dec 05 08:47:46 crc kubenswrapper[4795]: I1205 08:47:46.034348 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0feed6b1-c0cf-47f4-bdf0-da2147628dfb","Type":"ContainerStarted","Data":"a0c9c69c61c2142f2a082773ef425e97c687e2b86a498d02aad0942e7db15e06"} Dec 05 08:47:46 crc kubenswrapper[4795]: I1205 08:47:46.113786 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.1137518220000002 podStartE2EDuration="2.113751822s" podCreationTimestamp="2025-12-05 08:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:47:46.106131227 +0000 UTC m=+1417.678734976" watchObservedRunningTime="2025-12-05 08:47:46.113751822 +0000 UTC m=+1417.686355561" Dec 05 08:47:46 crc kubenswrapper[4795]: I1205 08:47:46.120209 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.120188556 podStartE2EDuration="2.120188556s" podCreationTimestamp="2025-12-05 08:47:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 08:47:46.066978301 +0000 UTC m=+1417.639582050" watchObservedRunningTime="2025-12-05 08:47:46.120188556 +0000 UTC m=+1417.692792295" Dec 05 08:47:50 crc kubenswrapper[4795]: I1205 08:47:50.954820 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.005990 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.089742 4795 generic.go:334] "Generic (PLEG): container finished" podID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerID="2b1896dbf9af209dafcaec7ed5c0e7f124f57325e662ab2dcc06df5dc35609e4" exitCode=137 Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.089781 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerDied","Data":"2b1896dbf9af209dafcaec7ed5c0e7f124f57325e662ab2dcc06df5dc35609e4"} Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.090314 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerStarted","Data":"8d58d2980a0272b8f9412a1472e5d911f076262d4fb4bdca07ace061afd28965"} Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.090415 4795 scope.go:117] "RemoveContainer" containerID="c588144102533680a34a5726505baa9bf35c19577cb0770b52dbb674df3a4575" Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.159670 4795 generic.go:334] "Generic (PLEG): container finished" podID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerID="a84791e2ae7b68c09f8fd75787a1361ec4ab9189478ca4102f37be6973b89990" exitCode=137 Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.160114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" 
event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerDied","Data":"a84791e2ae7b68c09f8fd75787a1361ec4ab9189478ca4102f37be6973b89990"} Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.160213 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57b485fdb4-h9cjs" event={"ID":"f89d9173-0065-4beb-a1b6-ba7be5094a58","Type":"ContainerStarted","Data":"d0fbd50543c811cd861c6c419ab7fbcf8efda7e35b37602e38ed27b03a616cb5"} Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.201355 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 08:47:51 crc kubenswrapper[4795]: I1205 08:47:51.333588 4795 scope.go:117] "RemoveContainer" containerID="6048f959ec29f10a78dde1c3f2cbc14da8e357cd096b40fc95865c55fc1eeb57" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.188697 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.212279 4795 generic.go:334] "Generic (PLEG): container finished" podID="f8713212-260c-47e9-ace2-16535a611c7b" containerID="e0fa72de53dc1ad92f9cb5cb6285c67be3685979872e0395ea1406815604c7ea" exitCode=137 Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.212470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8713212-260c-47e9-ace2-16535a611c7b","Type":"ContainerDied","Data":"e0fa72de53dc1ad92f9cb5cb6285c67be3685979872e0395ea1406815604c7ea"} Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.212525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f8713212-260c-47e9-ace2-16535a611c7b","Type":"ContainerDied","Data":"3debae000473f101e626e41739d4f713cf91759920c22f89fa6dff482b3b5cfa"} Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.212539 4795 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3debae000473f101e626e41739d4f713cf91759920c22f89fa6dff482b3b5cfa" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.217334 4795 generic.go:334] "Generic (PLEG): container finished" podID="6060b4e3-f743-45b7-b401-0e01950aadd8" containerID="7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f" exitCode=137 Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.217382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6060b4e3-f743-45b7-b401-0e01950aadd8","Type":"ContainerDied","Data":"7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f"} Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.217415 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6060b4e3-f743-45b7-b401-0e01950aadd8","Type":"ContainerDied","Data":"33b44d9d0fd56540faf7b3066eef06e8c19985c479bb234316828de447974aff"} Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.217435 4795 scope.go:117] "RemoveContainer" containerID="7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.217594 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.228445 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.269419 4795 scope.go:117] "RemoveContainer" containerID="7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f" Dec 05 08:47:54 crc kubenswrapper[4795]: E1205 08:47:54.272756 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f\": container with ID starting with 7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f not found: ID does not exist" containerID="7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.272816 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f"} err="failed to get container status \"7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f\": rpc error: code = NotFound desc = could not find container \"7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f\": container with ID starting with 7b5bef4c4eaec0ea60c2b3427c9d0d8520ca85faa9b628d70e35f52fdb5bb65f not found: ID does not exist" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.287972 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p94bf\" (UniqueName: \"kubernetes.io/projected/f8713212-260c-47e9-ace2-16535a611c7b-kube-api-access-p94bf\") pod \"f8713212-260c-47e9-ace2-16535a611c7b\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.288061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-combined-ca-bundle\") pod \"f8713212-260c-47e9-ace2-16535a611c7b\" (UID: 
\"f8713212-260c-47e9-ace2-16535a611c7b\") " Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.288153 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mccvm\" (UniqueName: \"kubernetes.io/projected/6060b4e3-f743-45b7-b401-0e01950aadd8-kube-api-access-mccvm\") pod \"6060b4e3-f743-45b7-b401-0e01950aadd8\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.288204 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-combined-ca-bundle\") pod \"6060b4e3-f743-45b7-b401-0e01950aadd8\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.288250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8713212-260c-47e9-ace2-16535a611c7b-logs\") pod \"f8713212-260c-47e9-ace2-16535a611c7b\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.288403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-config-data\") pod \"f8713212-260c-47e9-ace2-16535a611c7b\" (UID: \"f8713212-260c-47e9-ace2-16535a611c7b\") " Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.288489 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-config-data\") pod \"6060b4e3-f743-45b7-b401-0e01950aadd8\" (UID: \"6060b4e3-f743-45b7-b401-0e01950aadd8\") " Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.294761 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f8713212-260c-47e9-ace2-16535a611c7b-logs" (OuterVolumeSpecName: "logs") pod "f8713212-260c-47e9-ace2-16535a611c7b" (UID: "f8713212-260c-47e9-ace2-16535a611c7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.321427 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8713212-260c-47e9-ace2-16535a611c7b-kube-api-access-p94bf" (OuterVolumeSpecName: "kube-api-access-p94bf") pod "f8713212-260c-47e9-ace2-16535a611c7b" (UID: "f8713212-260c-47e9-ace2-16535a611c7b"). InnerVolumeSpecName "kube-api-access-p94bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.353468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8713212-260c-47e9-ace2-16535a611c7b" (UID: "f8713212-260c-47e9-ace2-16535a611c7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.354076 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6060b4e3-f743-45b7-b401-0e01950aadd8-kube-api-access-mccvm" (OuterVolumeSpecName: "kube-api-access-mccvm") pod "6060b4e3-f743-45b7-b401-0e01950aadd8" (UID: "6060b4e3-f743-45b7-b401-0e01950aadd8"). InnerVolumeSpecName "kube-api-access-mccvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.370157 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6060b4e3-f743-45b7-b401-0e01950aadd8" (UID: "6060b4e3-f743-45b7-b401-0e01950aadd8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.375025 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-config-data" (OuterVolumeSpecName: "config-data") pod "f8713212-260c-47e9-ace2-16535a611c7b" (UID: "f8713212-260c-47e9-ace2-16535a611c7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.388773 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-config-data" (OuterVolumeSpecName: "config-data") pod "6060b4e3-f743-45b7-b401-0e01950aadd8" (UID: "6060b4e3-f743-45b7-b401-0e01950aadd8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.394246 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mccvm\" (UniqueName: \"kubernetes.io/projected/6060b4e3-f743-45b7-b401-0e01950aadd8-kube-api-access-mccvm\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.394284 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.394294 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8713212-260c-47e9-ace2-16535a611c7b-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.394327 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.394337 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6060b4e3-f743-45b7-b401-0e01950aadd8-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.394349 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p94bf\" (UniqueName: \"kubernetes.io/projected/f8713212-260c-47e9-ace2-16535a611c7b-kube-api-access-p94bf\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.394357 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8713212-260c-47e9-ace2-16535a611c7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.482860 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.483283 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.549662 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.565552 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.573932 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.604439 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 08:47:54 crc kubenswrapper[4795]: E1205 08:47:54.605097 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6060b4e3-f743-45b7-b401-0e01950aadd8" 
containerName="nova-cell1-novncproxy-novncproxy" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.605133 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6060b4e3-f743-45b7-b401-0e01950aadd8" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 08:47:54 crc kubenswrapper[4795]: E1205 08:47:54.605188 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8713212-260c-47e9-ace2-16535a611c7b" containerName="nova-metadata-metadata" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.605198 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8713212-260c-47e9-ace2-16535a611c7b" containerName="nova-metadata-metadata" Dec 05 08:47:54 crc kubenswrapper[4795]: E1205 08:47:54.605215 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8713212-260c-47e9-ace2-16535a611c7b" containerName="nova-metadata-log" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.605224 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8713212-260c-47e9-ace2-16535a611c7b" containerName="nova-metadata-log" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.605453 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6060b4e3-f743-45b7-b401-0e01950aadd8" containerName="nova-cell1-novncproxy-novncproxy" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.605481 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8713212-260c-47e9-ace2-16535a611c7b" containerName="nova-metadata-log" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.605499 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8713212-260c-47e9-ace2-16535a611c7b" containerName="nova-metadata-metadata" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.612184 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.615981 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.616197 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.616320 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.640324 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.705472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.705647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.705686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: 
I1205 08:47:54.705726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npjll\" (UniqueName: \"kubernetes.io/projected/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-kube-api-access-npjll\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.705750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.788965 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6060b4e3-f743-45b7-b401-0e01950aadd8" path="/var/lib/kubelet/pods/6060b4e3-f743-45b7-b401-0e01950aadd8/volumes" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.809210 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.809849 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.810295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npjll\" (UniqueName: 
\"kubernetes.io/projected/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-kube-api-access-npjll\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.810365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.810439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.827941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.828020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.842018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-nova-novncproxy-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.846348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.854508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npjll\" (UniqueName: \"kubernetes.io/projected/a8fc111b-5e38-4955-a1ec-dbd8e155fd2f-kube-api-access-npjll\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:54 crc kubenswrapper[4795]: I1205 08:47:54.938888 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.237092 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.280284 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.314525 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.325660 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.327671 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.330762 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.331367 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.336794 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.460173 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.460238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-config-data\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.460340 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.460422 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab362166-6bc7-4832-932f-ba8417bf89e9-logs\") pod \"nova-metadata-0\" (UID: 
\"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.460468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksc9n\" (UniqueName: \"kubernetes.io/projected/ab362166-6bc7-4832-932f-ba8417bf89e9-kube-api-access-ksc9n\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.562624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.562676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-config-data\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.562719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.562775 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab362166-6bc7-4832-932f-ba8417bf89e9-logs\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.562804 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksc9n\" (UniqueName: \"kubernetes.io/projected/ab362166-6bc7-4832-932f-ba8417bf89e9-kube-api-access-ksc9n\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.565104 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.565934 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.565962 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.567056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab362166-6bc7-4832-932f-ba8417bf89e9-logs\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.573758 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-config-data\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.578160 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.581141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.583345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksc9n\" (UniqueName: \"kubernetes.io/projected/ab362166-6bc7-4832-932f-ba8417bf89e9-kube-api-access-ksc9n\") pod \"nova-metadata-0\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " pod="openstack/nova-metadata-0" Dec 05 08:47:55 crc kubenswrapper[4795]: W1205 08:47:55.590641 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8fc111b_5e38_4955_a1ec_dbd8e155fd2f.slice/crio-aa64a96b0006df694c8f55d00bc64a4f8d39618ede132df7a20a7e62ef62f6e9 WatchSource:0}: Error finding container aa64a96b0006df694c8f55d00bc64a4f8d39618ede132df7a20a7e62ef62f6e9: Status 404 returned error can't find the container with id aa64a96b0006df694c8f55d00bc64a4f8d39618ede132df7a20a7e62ef62f6e9 Dec 05 08:47:55 crc kubenswrapper[4795]: I1205 08:47:55.687977 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 08:47:56 crc kubenswrapper[4795]: I1205 08:47:56.250110 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f","Type":"ContainerStarted","Data":"aa64a96b0006df694c8f55d00bc64a4f8d39618ede132df7a20a7e62ef62f6e9"} Dec 05 08:47:56 crc kubenswrapper[4795]: I1205 08:47:56.324161 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:47:56 crc kubenswrapper[4795]: I1205 08:47:56.760376 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8713212-260c-47e9-ace2-16535a611c7b" path="/var/lib/kubelet/pods/f8713212-260c-47e9-ace2-16535a611c7b/volumes" Dec 05 08:47:57 crc kubenswrapper[4795]: I1205 08:47:57.265536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a8fc111b-5e38-4955-a1ec-dbd8e155fd2f","Type":"ContainerStarted","Data":"a4d51191bac05b2db05b46ae722d6dd56c060fc7d262031c85dc4b963cb3e295"} Dec 05 08:47:57 crc kubenswrapper[4795]: I1205 08:47:57.279330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab362166-6bc7-4832-932f-ba8417bf89e9","Type":"ContainerStarted","Data":"d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b"} Dec 05 08:47:57 crc kubenswrapper[4795]: I1205 08:47:57.279378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab362166-6bc7-4832-932f-ba8417bf89e9","Type":"ContainerStarted","Data":"51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa"} Dec 05 08:47:57 crc kubenswrapper[4795]: I1205 08:47:57.279391 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab362166-6bc7-4832-932f-ba8417bf89e9","Type":"ContainerStarted","Data":"532db7dda5b4de7735ed801ca6827b756def688b709de2d6d63c97e2f8e57cc1"} Dec 05 08:47:57 
crc kubenswrapper[4795]: I1205 08:47:57.295231 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.295203192 podStartE2EDuration="3.295203192s" podCreationTimestamp="2025-12-05 08:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:47:57.284588305 +0000 UTC m=+1428.857192044" watchObservedRunningTime="2025-12-05 08:47:57.295203192 +0000 UTC m=+1428.867806931" Dec 05 08:47:57 crc kubenswrapper[4795]: I1205 08:47:57.318017 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.31799162 podStartE2EDuration="2.31799162s" podCreationTimestamp="2025-12-05 08:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:47:57.311923836 +0000 UTC m=+1428.884527575" watchObservedRunningTime="2025-12-05 08:47:57.31799162 +0000 UTC m=+1428.890595359" Dec 05 08:47:59 crc kubenswrapper[4795]: I1205 08:47:59.939166 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:48:00 crc kubenswrapper[4795]: I1205 08:48:00.032524 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:48:00 crc kubenswrapper[4795]: I1205 08:48:00.032758 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:48:00 crc kubenswrapper[4795]: I1205 08:48:00.034532 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: 
connection refused" Dec 05 08:48:00 crc kubenswrapper[4795]: I1205 08:48:00.359024 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:48:00 crc kubenswrapper[4795]: I1205 08:48:00.359082 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:48:00 crc kubenswrapper[4795]: I1205 08:48:00.362251 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:48:00 crc kubenswrapper[4795]: I1205 08:48:00.690035 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 08:48:00 crc kubenswrapper[4795]: I1205 08:48:00.691605 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.315553 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.489476 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.489573 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.490294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.490368 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.501730 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.503039 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.801536 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-tbd8b"] Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.803729 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.818200 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-tbd8b"] Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.939635 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.949086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.949189 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.949241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.949605 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.949763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgv49\" (UniqueName: \"kubernetes.io/projected/d15fba4a-f47b-4143-ba07-6d368e19f33f-kube-api-access-jgv49\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.949978 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-config\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:04 crc kubenswrapper[4795]: I1205 08:48:04.978163 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.051929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 
08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.052082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.052117 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgv49\" (UniqueName: \"kubernetes.io/projected/d15fba4a-f47b-4143-ba07-6d368e19f33f-kube-api-access-jgv49\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.052215 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-config\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.052257 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.052320 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.053583 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-config\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.053730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.053853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.053902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.054483 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.075862 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgv49\" (UniqueName: 
\"kubernetes.io/projected/d15fba4a-f47b-4143-ba07-6d368e19f33f-kube-api-access-jgv49\") pod \"dnsmasq-dns-cd5cbd7b9-tbd8b\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.137839 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.510119 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.691231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.691311 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.761215 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-tbd8b"] Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.838766 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-bml8k"] Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.841918 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.849407 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.850022 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.933942 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bml8k"] Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.988096 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.988171 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-config-data\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.988222 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rglx2\" (UniqueName: \"kubernetes.io/projected/22800d9f-49d9-4d82-b097-0e3e52a3d311-kube-api-access-rglx2\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:05 crc kubenswrapper[4795]: I1205 08:48:05.988248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-scripts\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.091011 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.091128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-config-data\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.091158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rglx2\" (UniqueName: \"kubernetes.io/projected/22800d9f-49d9-4d82-b097-0e3e52a3d311-kube-api-access-rglx2\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.092021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-scripts\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.100599 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-config-data\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.100963 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.105297 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-scripts\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.114342 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rglx2\" (UniqueName: \"kubernetes.io/projected/22800d9f-49d9-4d82-b097-0e3e52a3d311-kube-api-access-rglx2\") pod \"nova-cell1-cell-mapping-bml8k\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") " pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.174406 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.455179 4795 generic.go:334] "Generic (PLEG): container finished" podID="d15fba4a-f47b-4143-ba07-6d368e19f33f" containerID="c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235" exitCode=0 Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.455934 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" event={"ID":"d15fba4a-f47b-4143-ba07-6d368e19f33f","Type":"ContainerDied","Data":"c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235"} Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.455971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" event={"ID":"d15fba4a-f47b-4143-ba07-6d368e19f33f","Type":"ContainerStarted","Data":"323195fb26c1f87a425aad14432b7b2af7fc3a49134602cae4b250e05d04550e"} Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.752811 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.753502 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:48:06 crc kubenswrapper[4795]: I1205 08:48:06.892008 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bml8k"] Dec 05 08:48:07 crc kubenswrapper[4795]: I1205 08:48:07.474356 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" event={"ID":"d15fba4a-f47b-4143-ba07-6d368e19f33f","Type":"ContainerStarted","Data":"1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e"} Dec 05 08:48:07 crc kubenswrapper[4795]: I1205 08:48:07.487078 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:07 crc kubenswrapper[4795]: I1205 08:48:07.487222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bml8k" event={"ID":"22800d9f-49d9-4d82-b097-0e3e52a3d311","Type":"ContainerStarted","Data":"3d09817b423c83115fc9d7d2accc81bcbc9a252ef25ea40944df638f9a422b69"} Dec 05 08:48:07 crc kubenswrapper[4795]: I1205 08:48:07.487305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bml8k" event={"ID":"22800d9f-49d9-4d82-b097-0e3e52a3d311","Type":"ContainerStarted","Data":"a123e30c14a610f2f43837f3e53c0b52bb630b4edf4c9d18ca1c57b3a80243df"} Dec 05 08:48:07 crc kubenswrapper[4795]: I1205 08:48:07.516705 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" podStartSLOduration=3.516661764 podStartE2EDuration="3.516661764s" podCreationTimestamp="2025-12-05 08:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:48:07.512518801 +0000 UTC m=+1439.085122550" watchObservedRunningTime="2025-12-05 08:48:07.516661764 +0000 UTC m=+1439.089265503" Dec 05 08:48:08 crc kubenswrapper[4795]: I1205 08:48:08.223279 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-bml8k" podStartSLOduration=3.223252924 podStartE2EDuration="3.223252924s" podCreationTimestamp="2025-12-05 08:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-05 08:48:07.545713101 +0000 UTC m=+1439.118316840" watchObservedRunningTime="2025-12-05 08:48:08.223252924 +0000 UTC m=+1439.795856663" Dec 05 08:48:08 crc kubenswrapper[4795]: I1205 08:48:08.230524 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:08 crc kubenswrapper[4795]: I1205 08:48:08.230899 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-log" containerID="cri-o://838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b" gracePeriod=30 Dec 05 08:48:08 crc kubenswrapper[4795]: I1205 08:48:08.231069 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-api" containerID="cri-o://9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc" gracePeriod=30 Dec 05 08:48:08 crc kubenswrapper[4795]: I1205 08:48:08.499251 4795 generic.go:334] "Generic (PLEG): container finished" podID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerID="838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b" exitCode=143 Dec 05 08:48:08 crc kubenswrapper[4795]: I1205 08:48:08.500459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0feed6b1-c0cf-47f4-bdf0-da2147628dfb","Type":"ContainerDied","Data":"838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b"} Dec 05 08:48:10 crc kubenswrapper[4795]: I1205 08:48:10.033459 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 05 08:48:10 crc kubenswrapper[4795]: I1205 08:48:10.359213 
4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57b485fdb4-h9cjs" podUID="f89d9173-0065-4beb-a1b6-ba7be5094a58" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 05 08:48:10 crc kubenswrapper[4795]: I1205 08:48:10.827041 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:48:10 crc kubenswrapper[4795]: I1205 08:48:10.827551 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.272744 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.273503 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="ceilometer-central-agent" containerID="cri-o://6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946" gracePeriod=30 Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.274884 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="proxy-httpd" containerID="cri-o://861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f" gracePeriod=30 Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.275322 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="ceilometer-notification-agent" containerID="cri-o://2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7" gracePeriod=30 Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.275408 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="sg-core" containerID="cri-o://a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f" gracePeriod=30 Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.309977 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.408472 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-combined-ca-bundle\") pod \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.408584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-logs\") pod \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.408697 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-config-data\") pod \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.408752 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nttdl\" (UniqueName: \"kubernetes.io/projected/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-kube-api-access-nttdl\") pod \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\" (UID: \"0feed6b1-c0cf-47f4-bdf0-da2147628dfb\") " Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.409513 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-logs" (OuterVolumeSpecName: "logs") pod "0feed6b1-c0cf-47f4-bdf0-da2147628dfb" (UID: "0feed6b1-c0cf-47f4-bdf0-da2147628dfb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.418601 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-kube-api-access-nttdl" (OuterVolumeSpecName: "kube-api-access-nttdl") pod "0feed6b1-c0cf-47f4-bdf0-da2147628dfb" (UID: "0feed6b1-c0cf-47f4-bdf0-da2147628dfb"). InnerVolumeSpecName "kube-api-access-nttdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.452802 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0feed6b1-c0cf-47f4-bdf0-da2147628dfb" (UID: "0feed6b1-c0cf-47f4-bdf0-da2147628dfb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.471843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-config-data" (OuterVolumeSpecName: "config-data") pod "0feed6b1-c0cf-47f4-bdf0-da2147628dfb" (UID: "0feed6b1-c0cf-47f4-bdf0-da2147628dfb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.511324 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.511372 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.511384 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.511394 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nttdl\" (UniqueName: \"kubernetes.io/projected/0feed6b1-c0cf-47f4-bdf0-da2147628dfb-kube-api-access-nttdl\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.563719 4795 generic.go:334] "Generic (PLEG): container finished" podID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerID="a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f" exitCode=2 Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.563820 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerDied","Data":"a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f"} Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.570828 4795 generic.go:334] "Generic (PLEG): container finished" podID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerID="9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc" exitCode=0 Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.570875 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0feed6b1-c0cf-47f4-bdf0-da2147628dfb","Type":"ContainerDied","Data":"9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc"} Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.570924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0feed6b1-c0cf-47f4-bdf0-da2147628dfb","Type":"ContainerDied","Data":"a0c9c69c61c2142f2a082773ef425e97c687e2b86a498d02aad0942e7db15e06"} Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.570944 4795 scope.go:117] "RemoveContainer" containerID="9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.571168 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.629329 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.637977 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.677117 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:12 crc kubenswrapper[4795]: E1205 08:48:12.677695 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-api" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.677711 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-api" Dec 05 08:48:12 crc kubenswrapper[4795]: E1205 08:48:12.677740 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-log" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.677747 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-log" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.677957 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-api" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.677980 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" containerName="nova-api-log" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.681402 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.686341 4795 scope.go:117] "RemoveContainer" containerID="838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.688007 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.688316 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.688520 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.709937 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.767273 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0feed6b1-c0cf-47f4-bdf0-da2147628dfb" path="/var/lib/kubelet/pods/0feed6b1-c0cf-47f4-bdf0-da2147628dfb/volumes" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.824180 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e091d7a9-ce0f-4a98-ac4d-fed812949fad-logs\") pod 
\"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.824304 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-config-data\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.824338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bbj\" (UniqueName: \"kubernetes.io/projected/e091d7a9-ce0f-4a98-ac4d-fed812949fad-kube-api-access-l4bbj\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.824379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.824395 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-public-tls-certs\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.824418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc 
kubenswrapper[4795]: I1205 08:48:12.826085 4795 scope.go:117] "RemoveContainer" containerID="9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc" Dec 05 08:48:12 crc kubenswrapper[4795]: E1205 08:48:12.827755 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc\": container with ID starting with 9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc not found: ID does not exist" containerID="9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.827795 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc"} err="failed to get container status \"9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc\": rpc error: code = NotFound desc = could not find container \"9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc\": container with ID starting with 9b1ae1890fe8e515d6b01038b1c2cb8dce055ebc6a8afd648736aa9ab693c0dc not found: ID does not exist" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.827826 4795 scope.go:117] "RemoveContainer" containerID="838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b" Dec 05 08:48:12 crc kubenswrapper[4795]: E1205 08:48:12.840125 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b\": container with ID starting with 838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b not found: ID does not exist" containerID="838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.840200 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b"} err="failed to get container status \"838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b\": rpc error: code = NotFound desc = could not find container \"838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b\": container with ID starting with 838b3b5006f7a4c3a854e5196ce2e95c447ea6f41c8507400c943106462ad32b not found: ID does not exist" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.926283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.926841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e091d7a9-ce0f-4a98-ac4d-fed812949fad-logs\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.926947 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-config-data\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.926981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bbj\" (UniqueName: \"kubernetes.io/projected/e091d7a9-ce0f-4a98-ac4d-fed812949fad-kube-api-access-l4bbj\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.927036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.927058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-public-tls-certs\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.927541 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e091d7a9-ce0f-4a98-ac4d-fed812949fad-logs\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.932588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.933503 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.935596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-config-data\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 
08:48:12.936125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-public-tls-certs\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:12 crc kubenswrapper[4795]: I1205 08:48:12.964225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bbj\" (UniqueName: \"kubernetes.io/projected/e091d7a9-ce0f-4a98-ac4d-fed812949fad-kube-api-access-l4bbj\") pod \"nova-api-0\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " pod="openstack/nova-api-0" Dec 05 08:48:13 crc kubenswrapper[4795]: I1205 08:48:13.132586 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:48:13 crc kubenswrapper[4795]: I1205 08:48:13.597110 4795 generic.go:334] "Generic (PLEG): container finished" podID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerID="861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f" exitCode=0 Dec 05 08:48:13 crc kubenswrapper[4795]: I1205 08:48:13.597479 4795 generic.go:334] "Generic (PLEG): container finished" podID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerID="6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946" exitCode=0 Dec 05 08:48:13 crc kubenswrapper[4795]: I1205 08:48:13.597521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerDied","Data":"861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f"} Dec 05 08:48:13 crc kubenswrapper[4795]: I1205 08:48:13.597562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerDied","Data":"6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946"} Dec 05 08:48:13 crc kubenswrapper[4795]: I1205 08:48:13.790635 4795 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:13 crc kubenswrapper[4795]: W1205 08:48:13.793586 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode091d7a9_ce0f_4a98_ac4d_fed812949fad.slice/crio-a0adc212a7ec27726b28ea791548911fd3fa23f82eebebe6610b34b6faa2aa39 WatchSource:0}: Error finding container a0adc212a7ec27726b28ea791548911fd3fa23f82eebebe6610b34b6faa2aa39: Status 404 returned error can't find the container with id a0adc212a7ec27726b28ea791548911fd3fa23f82eebebe6610b34b6faa2aa39 Dec 05 08:48:14 crc kubenswrapper[4795]: I1205 08:48:14.619533 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e091d7a9-ce0f-4a98-ac4d-fed812949fad","Type":"ContainerStarted","Data":"5542c7b9f9766535f97696e38456699fce8bc0654b59fd9e4aede52428c46d30"} Dec 05 08:48:14 crc kubenswrapper[4795]: I1205 08:48:14.619899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e091d7a9-ce0f-4a98-ac4d-fed812949fad","Type":"ContainerStarted","Data":"cd73f97dce42ef39cdf6db7475fafdb6d2970e0666b8ffe52676cd6f819eb7cb"} Dec 05 08:48:14 crc kubenswrapper[4795]: I1205 08:48:14.619912 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e091d7a9-ce0f-4a98-ac4d-fed812949fad","Type":"ContainerStarted","Data":"a0adc212a7ec27726b28ea791548911fd3fa23f82eebebe6610b34b6faa2aa39"} Dec 05 08:48:14 crc kubenswrapper[4795]: I1205 08:48:14.655231 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.655193558 podStartE2EDuration="2.655193558s" podCreationTimestamp="2025-12-05 08:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:48:14.649163915 +0000 UTC m=+1446.221767654" 
watchObservedRunningTime="2025-12-05 08:48:14.655193558 +0000 UTC m=+1446.227797297" Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.140370 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.227961 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-ltkvs"] Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.228362 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" podUID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" containerName="dnsmasq-dns" containerID="cri-o://fb51cdfd590896e959394cafa5039139a64c7e35c084178852d0332dad2669e1" gracePeriod=10 Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.643532 4795 generic.go:334] "Generic (PLEG): container finished" podID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" containerID="fb51cdfd590896e959394cafa5039139a64c7e35c084178852d0332dad2669e1" exitCode=0 Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.644791 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" event={"ID":"244cc672-47b5-4bdf-9466-f2f0f7c73e8f","Type":"ContainerDied","Data":"fb51cdfd590896e959394cafa5039139a64c7e35c084178852d0332dad2669e1"} Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.706323 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.706817 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.731025 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.732763 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0"
Dec 05 08:48:15 crc kubenswrapper[4795]: I1205 08:48:15.987467 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.044011 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-config\") pod \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") "
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.044152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-sb\") pod \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") "
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.044209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-nb\") pod \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") "
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.044379 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-swift-storage-0\") pod \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") "
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.044484 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbt62\" (UniqueName: \"kubernetes.io/projected/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-kube-api-access-wbt62\") pod \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") "
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.044794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-svc\") pod \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\" (UID: \"244cc672-47b5-4bdf-9466-f2f0f7c73e8f\") "
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.056427 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-kube-api-access-wbt62" (OuterVolumeSpecName: "kube-api-access-wbt62") pod "244cc672-47b5-4bdf-9466-f2f0f7c73e8f" (UID: "244cc672-47b5-4bdf-9466-f2f0f7c73e8f"). InnerVolumeSpecName "kube-api-access-wbt62". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.147871 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbt62\" (UniqueName: \"kubernetes.io/projected/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-kube-api-access-wbt62\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.194778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "244cc672-47b5-4bdf-9466-f2f0f7c73e8f" (UID: "244cc672-47b5-4bdf-9466-f2f0f7c73e8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.205241 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "244cc672-47b5-4bdf-9466-f2f0f7c73e8f" (UID: "244cc672-47b5-4bdf-9466-f2f0f7c73e8f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.216801 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "244cc672-47b5-4bdf-9466-f2f0f7c73e8f" (UID: "244cc672-47b5-4bdf-9466-f2f0f7c73e8f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.220032 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "244cc672-47b5-4bdf-9466-f2f0f7c73e8f" (UID: "244cc672-47b5-4bdf-9466-f2f0f7c73e8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.224280 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-config" (OuterVolumeSpecName: "config") pod "244cc672-47b5-4bdf-9466-f2f0f7c73e8f" (UID: "244cc672-47b5-4bdf-9466-f2f0f7c73e8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.250287 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.250341 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.250356 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.250403 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-config\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.250413 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/244cc672-47b5-4bdf-9466-f2f0f7c73e8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.655996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs" event={"ID":"244cc672-47b5-4bdf-9466-f2f0f7c73e8f","Type":"ContainerDied","Data":"d595575fc4ae0fffc080659e83e924a00f3208aedf13b3e7a7c81e8e296c96e2"}
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.656025 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-ltkvs"
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.656119 4795 scope.go:117] "RemoveContainer" containerID="fb51cdfd590896e959394cafa5039139a64c7e35c084178852d0332dad2669e1"
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.658288 4795 generic.go:334] "Generic (PLEG): container finished" podID="22800d9f-49d9-4d82-b097-0e3e52a3d311" containerID="3d09817b423c83115fc9d7d2accc81bcbc9a252ef25ea40944df638f9a422b69" exitCode=0
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.658578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bml8k" event={"ID":"22800d9f-49d9-4d82-b097-0e3e52a3d311","Type":"ContainerDied","Data":"3d09817b423c83115fc9d7d2accc81bcbc9a252ef25ea40944df638f9a422b69"}
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.687727 4795 scope.go:117] "RemoveContainer" containerID="1e16a0455b8e7f7004d24d7f988a319f680124b57bcf502e280e053f4f21b33a"
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.729696 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-ltkvs"]
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.742334 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-ltkvs"]
Dec 05 08:48:16 crc kubenswrapper[4795]: I1205 08:48:16.763086 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" path="/var/lib/kubelet/pods/244cc672-47b5-4bdf-9466-f2f0f7c73e8f/volumes"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.308233 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.391907 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqgmb\" (UniqueName: \"kubernetes.io/projected/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-kube-api-access-cqgmb\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.392019 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-run-httpd\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.392065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-scripts\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.392111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-log-httpd\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.392140 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-config-data\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.392213 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-combined-ca-bundle\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.392272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-sg-core-conf-yaml\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.392312 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-ceilometer-tls-certs\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.393270 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.393322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.398951 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-scripts" (OuterVolumeSpecName: "scripts") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.407488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-kube-api-access-cqgmb" (OuterVolumeSpecName: "kube-api-access-cqgmb") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "kube-api-access-cqgmb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.494851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.495981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-sg-core-conf-yaml\") pod \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\" (UID: \"837f4be0-582b-41a0-92c3-7c8ad1aecd0e\") "
Dec 05 08:48:17 crc kubenswrapper[4795]: W1205 08:48:17.496290 4795 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/837f4be0-582b-41a0-92c3-7c8ad1aecd0e/volumes/kubernetes.io~secret/sg-core-conf-yaml
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.496643 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.497817 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqgmb\" (UniqueName: \"kubernetes.io/projected/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-kube-api-access-cqgmb\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.497950 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.498050 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-scripts\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.498151 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.498243 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.530073 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.585928 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-config-data" (OuterVolumeSpecName: "config-data") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.600488 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.600535 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.650110 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "837f4be0-582b-41a0-92c3-7c8ad1aecd0e" (UID: "837f4be0-582b-41a0-92c3-7c8ad1aecd0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.702532 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837f4be0-582b-41a0-92c3-7c8ad1aecd0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.705842 4795 generic.go:334] "Generic (PLEG): container finished" podID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerID="2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7" exitCode=0
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.706349 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.707212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerDied","Data":"2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7"}
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.707322 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"837f4be0-582b-41a0-92c3-7c8ad1aecd0e","Type":"ContainerDied","Data":"cc2b5ce7a2f3f148d185f794b4a65a98bb7420c7bc8ab4032294a0ae7a783900"}
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.707450 4795 scope.go:117] "RemoveContainer" containerID="861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.790693 4795 scope.go:117] "RemoveContainer" containerID="a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.797280 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.820109 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.840871 4795 scope.go:117] "RemoveContainer" containerID="2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.863078 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.864199 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" containerName="init"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864230 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" containerName="init"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.864260 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="ceilometer-notification-agent"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864272 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="ceilometer-notification-agent"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.864301 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="sg-core"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864308 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="sg-core"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.864317 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="ceilometer-central-agent"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864324 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="ceilometer-central-agent"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.864338 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" containerName="dnsmasq-dns"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864345 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" containerName="dnsmasq-dns"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.864353 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="proxy-httpd"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864360 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="proxy-httpd"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864594 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="244cc672-47b5-4bdf-9466-f2f0f7c73e8f" containerName="dnsmasq-dns"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864684 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="sg-core"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864694 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="ceilometer-central-agent"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864703 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="proxy-httpd"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.864720 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" containerName="ceilometer-notification-agent"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.869128 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.873381 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.875218 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.875902 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.876949 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.920026 4795 scope.go:117] "RemoveContainer" containerID="6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.922137 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.922185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.922239 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aee9ade7-8fe6-4548-aa94-032d421ac9ab-run-httpd\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.922292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-scripts\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.922423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqw7q\" (UniqueName: \"kubernetes.io/projected/aee9ade7-8fe6-4548-aa94-032d421ac9ab-kube-api-access-tqw7q\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.922442 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.922473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aee9ade7-8fe6-4548-aa94-032d421ac9ab-log-httpd\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.922510 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-config-data\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.955110 4795 scope.go:117] "RemoveContainer" containerID="861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.956267 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f\": container with ID starting with 861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f not found: ID does not exist" containerID="861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.956336 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f"} err="failed to get container status \"861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f\": rpc error: code = NotFound desc = could not find container \"861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f\": container with ID starting with 861764af39708b6ee1d8b72b747fa3bb8d7bfec1a90ab3f2ed582955f9be139f not found: ID does not exist"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.956373 4795 scope.go:117] "RemoveContainer" containerID="a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.957136 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f\": container with ID starting with a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f not found: ID does not exist" containerID="a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.957192 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f"} err="failed to get container status \"a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f\": rpc error: code = NotFound desc = could not find container \"a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f\": container with ID starting with a2824c3ffd7e1a9b4c84da3ebe9fa5583dc24a070c7c926fedb39de57b10334f not found: ID does not exist"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.957232 4795 scope.go:117] "RemoveContainer" containerID="2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.957720 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7\": container with ID starting with 2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7 not found: ID does not exist" containerID="2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.957864 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7"} err="failed to get container status \"2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7\": rpc error: code = NotFound desc = could not find container \"2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7\": container with ID starting with 2571a4d8b706f859c1923e5c532840a31890889325c69e80fdd6ea9980e2bea7 not found: ID does not exist"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.957981 4795 scope.go:117] "RemoveContainer" containerID="6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946"
Dec 05 08:48:17 crc kubenswrapper[4795]: E1205 08:48:17.958373 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946\": container with ID starting with 6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946 not found: ID does not exist" containerID="6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946"
Dec 05 08:48:17 crc kubenswrapper[4795]: I1205 08:48:17.958483 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946"} err="failed to get container status \"6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946\": rpc error: code = NotFound desc = could not find container \"6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946\": container with ID starting with 6f2a733ae294763fe602f6ef82bf32eb5a105d9996145579dbc0dfe548511946 not found: ID does not exist"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.024736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.024868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aee9ade7-8fe6-4548-aa94-032d421ac9ab-run-httpd\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.024952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-scripts\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.025021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqw7q\" (UniqueName: \"kubernetes.io/projected/aee9ade7-8fe6-4548-aa94-032d421ac9ab-kube-api-access-tqw7q\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.025048 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.025103 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aee9ade7-8fe6-4548-aa94-032d421ac9ab-log-httpd\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.025139 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-config-data\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.025208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.030815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aee9ade7-8fe6-4548-aa94-032d421ac9ab-log-httpd\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.030990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aee9ade7-8fe6-4548-aa94-032d421ac9ab-run-httpd\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.048832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-config-data\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.052013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.053348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqw7q\" (UniqueName: \"kubernetes.io/projected/aee9ade7-8fe6-4548-aa94-032d421ac9ab-kube-api-access-tqw7q\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.057413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-scripts\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.057855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.065176 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee9ade7-8fe6-4548-aa94-032d421ac9ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aee9ade7-8fe6-4548-aa94-032d421ac9ab\") " pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.146385 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bml8k"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.228409 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.235108 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-scripts\") pod \"22800d9f-49d9-4d82-b097-0e3e52a3d311\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") "
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.235175 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-config-data\") pod \"22800d9f-49d9-4d82-b097-0e3e52a3d311\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") "
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.235347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-combined-ca-bundle\") pod \"22800d9f-49d9-4d82-b097-0e3e52a3d311\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") "
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.235488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rglx2\" (UniqueName: \"kubernetes.io/projected/22800d9f-49d9-4d82-b097-0e3e52a3d311-kube-api-access-rglx2\") pod \"22800d9f-49d9-4d82-b097-0e3e52a3d311\" (UID: \"22800d9f-49d9-4d82-b097-0e3e52a3d311\") "
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.250167 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22800d9f-49d9-4d82-b097-0e3e52a3d311-kube-api-access-rglx2" (OuterVolumeSpecName: "kube-api-access-rglx2") pod "22800d9f-49d9-4d82-b097-0e3e52a3d311" (UID: "22800d9f-49d9-4d82-b097-0e3e52a3d311"). InnerVolumeSpecName "kube-api-access-rglx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.274816 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-scripts" (OuterVolumeSpecName: "scripts") pod "22800d9f-49d9-4d82-b097-0e3e52a3d311" (UID: "22800d9f-49d9-4d82-b097-0e3e52a3d311"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.276540 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22800d9f-49d9-4d82-b097-0e3e52a3d311" (UID: "22800d9f-49d9-4d82-b097-0e3e52a3d311"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.299156 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-config-data" (OuterVolumeSpecName: "config-data") pod "22800d9f-49d9-4d82-b097-0e3e52a3d311" (UID: "22800d9f-49d9-4d82-b097-0e3e52a3d311"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.338251 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.338293 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rglx2\" (UniqueName: \"kubernetes.io/projected/22800d9f-49d9-4d82-b097-0e3e52a3d311-kube-api-access-rglx2\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.338309 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.338318 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22800d9f-49d9-4d82-b097-0e3e52a3d311-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.719189 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bml8k" event={"ID":"22800d9f-49d9-4d82-b097-0e3e52a3d311","Type":"ContainerDied","Data":"a123e30c14a610f2f43837f3e53c0b52bb630b4edf4c9d18ca1c57b3a80243df"} Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.719833 4795 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a123e30c14a610f2f43837f3e53c0b52bb630b4edf4c9d18ca1c57b3a80243df" Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.719917 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bml8k" Dec 05 08:48:18 crc kubenswrapper[4795]: W1205 08:48:18.754426 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee9ade7_8fe6_4548_aa94_032d421ac9ab.slice/crio-4bfb923c29008ef37e3756f02c75756ac73a7cefa39b37f610337743d589d421 WatchSource:0}: Error finding container 4bfb923c29008ef37e3756f02c75756ac73a7cefa39b37f610337743d589d421: Status 404 returned error can't find the container with id 4bfb923c29008ef37e3756f02c75756ac73a7cefa39b37f610337743d589d421 Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.772170 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837f4be0-582b-41a0-92c3-7c8ad1aecd0e" path="/var/lib/kubelet/pods/837f4be0-582b-41a0-92c3-7c8ad1aecd0e/volumes" Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.773269 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.905667 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.905969 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d371839f-fc5b-4287-bccf-f0077497e3e2" containerName="nova-scheduler-scheduler" containerID="cri-o://726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f" gracePeriod=30 Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.934416 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.940430 4795 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-api-0" podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerName="nova-api-api" containerID="cri-o://5542c7b9f9766535f97696e38456699fce8bc0654b59fd9e4aede52428c46d30" gracePeriod=30 Dec 05 08:48:18 crc kubenswrapper[4795]: I1205 08:48:18.941082 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerName="nova-api-log" containerID="cri-o://cd73f97dce42ef39cdf6db7475fafdb6d2970e0666b8ffe52676cd6f819eb7cb" gracePeriod=30 Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.001473 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.001947 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-log" containerID="cri-o://51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa" gracePeriod=30 Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.002256 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-metadata" containerID="cri-o://d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b" gracePeriod=30 Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.764534 4795 generic.go:334] "Generic (PLEG): container finished" podID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerID="5542c7b9f9766535f97696e38456699fce8bc0654b59fd9e4aede52428c46d30" exitCode=0 Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.765097 4795 generic.go:334] "Generic (PLEG): container finished" podID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerID="cd73f97dce42ef39cdf6db7475fafdb6d2970e0666b8ffe52676cd6f819eb7cb" exitCode=143 Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.765150 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e091d7a9-ce0f-4a98-ac4d-fed812949fad","Type":"ContainerDied","Data":"5542c7b9f9766535f97696e38456699fce8bc0654b59fd9e4aede52428c46d30"} Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.765186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e091d7a9-ce0f-4a98-ac4d-fed812949fad","Type":"ContainerDied","Data":"cd73f97dce42ef39cdf6db7475fafdb6d2970e0666b8ffe52676cd6f819eb7cb"} Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.766600 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aee9ade7-8fe6-4548-aa94-032d421ac9ab","Type":"ContainerStarted","Data":"bd6d2f93df87895ba6aa24f249a6eae10babdae342e731be1abfb7e37ad69919"} Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.766651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aee9ade7-8fe6-4548-aa94-032d421ac9ab","Type":"ContainerStarted","Data":"4bfb923c29008ef37e3756f02c75756ac73a7cefa39b37f610337743d589d421"} Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.767905 4795 generic.go:334] "Generic (PLEG): container finished" podID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerID="51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa" exitCode=143 Dec 05 08:48:19 crc kubenswrapper[4795]: I1205 08:48:19.767925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab362166-6bc7-4832-932f-ba8417bf89e9","Type":"ContainerDied","Data":"51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa"} Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.186184 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.295929 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-config-data\") pod \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.296022 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-public-tls-certs\") pod \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.296054 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e091d7a9-ce0f-4a98-ac4d-fed812949fad-logs\") pod \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.296200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-combined-ca-bundle\") pod \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.296289 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-internal-tls-certs\") pod \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.296337 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bbj\" (UniqueName: 
\"kubernetes.io/projected/e091d7a9-ce0f-4a98-ac4d-fed812949fad-kube-api-access-l4bbj\") pod \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\" (UID: \"e091d7a9-ce0f-4a98-ac4d-fed812949fad\") " Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.297408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e091d7a9-ce0f-4a98-ac4d-fed812949fad-logs" (OuterVolumeSpecName: "logs") pod "e091d7a9-ce0f-4a98-ac4d-fed812949fad" (UID: "e091d7a9-ce0f-4a98-ac4d-fed812949fad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.304358 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e091d7a9-ce0f-4a98-ac4d-fed812949fad-kube-api-access-l4bbj" (OuterVolumeSpecName: "kube-api-access-l4bbj") pod "e091d7a9-ce0f-4a98-ac4d-fed812949fad" (UID: "e091d7a9-ce0f-4a98-ac4d-fed812949fad"). InnerVolumeSpecName "kube-api-access-l4bbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.368953 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-config-data" (OuterVolumeSpecName: "config-data") pod "e091d7a9-ce0f-4a98-ac4d-fed812949fad" (UID: "e091d7a9-ce0f-4a98-ac4d-fed812949fad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.399884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e091d7a9-ce0f-4a98-ac4d-fed812949fad" (UID: "e091d7a9-ce0f-4a98-ac4d-fed812949fad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.401753 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4bbj\" (UniqueName: \"kubernetes.io/projected/e091d7a9-ce0f-4a98-ac4d-fed812949fad-kube-api-access-l4bbj\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.401792 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.401803 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e091d7a9-ce0f-4a98-ac4d-fed812949fad-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.401812 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.443913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e091d7a9-ce0f-4a98-ac4d-fed812949fad" (UID: "e091d7a9-ce0f-4a98-ac4d-fed812949fad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.475981 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e091d7a9-ce0f-4a98-ac4d-fed812949fad" (UID: "e091d7a9-ce0f-4a98-ac4d-fed812949fad"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.506451 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.506482 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e091d7a9-ce0f-4a98-ac4d-fed812949fad-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.779920 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aee9ade7-8fe6-4548-aa94-032d421ac9ab","Type":"ContainerStarted","Data":"12b807c6ab3f4e20a781ba44e13ecf48748d2484a09c8246365d9d9dae83d955"} Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.781642 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e091d7a9-ce0f-4a98-ac4d-fed812949fad","Type":"ContainerDied","Data":"a0adc212a7ec27726b28ea791548911fd3fa23f82eebebe6610b34b6faa2aa39"} Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.781679 4795 scope.go:117] "RemoveContainer" containerID="5542c7b9f9766535f97696e38456699fce8bc0654b59fd9e4aede52428c46d30" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.781887 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.852659 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.853920 4795 scope.go:117] "RemoveContainer" containerID="cd73f97dce42ef39cdf6db7475fafdb6d2970e0666b8ffe52676cd6f819eb7cb" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.866153 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.892520 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:20 crc kubenswrapper[4795]: E1205 08:48:20.894323 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerName="nova-api-api" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.894351 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerName="nova-api-api" Dec 05 08:48:20 crc kubenswrapper[4795]: E1205 08:48:20.894375 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22800d9f-49d9-4d82-b097-0e3e52a3d311" containerName="nova-manage" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.894384 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="22800d9f-49d9-4d82-b097-0e3e52a3d311" containerName="nova-manage" Dec 05 08:48:20 crc kubenswrapper[4795]: E1205 08:48:20.894398 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerName="nova-api-log" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.894406 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerName="nova-api-log" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.897293 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerName="nova-api-api" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.922856 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" containerName="nova-api-log" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.931752 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="22800d9f-49d9-4d82-b097-0e3e52a3d311" containerName="nova-manage" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.934264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.934415 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.960352 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.961031 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 05 08:48:20 crc kubenswrapper[4795]: I1205 08:48:20.961302 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 05 08:48:20 crc kubenswrapper[4795]: E1205 08:48:20.970727 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 08:48:20 crc kubenswrapper[4795]: E1205 08:48:20.982355 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 08:48:20 crc kubenswrapper[4795]: E1205 08:48:20.989471 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 05 08:48:20 crc kubenswrapper[4795]: E1205 08:48:20.989556 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d371839f-fc5b-4287-bccf-f0077497e3e2" containerName="nova-scheduler-scheduler" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.035589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.035996 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-config-data\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.036086 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-public-tls-certs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " 
pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.036293 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a69c13-aa37-4fad-a00f-2c1aafc627c4-logs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.036372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.036506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlz4j\" (UniqueName: \"kubernetes.io/projected/51a69c13-aa37-4fad-a00f-2c1aafc627c4-kube-api-access-jlz4j\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.143274 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a69c13-aa37-4fad-a00f-2c1aafc627c4-logs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.143351 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.143429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jlz4j\" (UniqueName: \"kubernetes.io/projected/51a69c13-aa37-4fad-a00f-2c1aafc627c4-kube-api-access-jlz4j\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.143476 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.143506 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-config-data\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.143525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-public-tls-certs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.144075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a69c13-aa37-4fad-a00f-2c1aafc627c4-logs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.156653 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: 
I1205 08:48:21.156666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-public-tls-certs\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.157216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.158828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a69c13-aa37-4fad-a00f-2c1aafc627c4-config-data\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.166715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlz4j\" (UniqueName: \"kubernetes.io/projected/51a69c13-aa37-4fad-a00f-2c1aafc627c4-kube-api-access-jlz4j\") pod \"nova-api-0\" (UID: \"51a69c13-aa37-4fad-a00f-2c1aafc627c4\") " pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.307735 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.797022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aee9ade7-8fe6-4548-aa94-032d421ac9ab","Type":"ContainerStarted","Data":"b67d809b1bb0e5b56fe515d9a1a88e56b3f6d8f1b40546626377a8957eda476a"} Dec 05 08:48:21 crc kubenswrapper[4795]: I1205 08:48:21.911078 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 05 08:48:21 crc kubenswrapper[4795]: W1205 08:48:21.951918 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51a69c13_aa37_4fad_a00f_2c1aafc627c4.slice/crio-aff9940698e10598a2d0d237883d2a5b6aef73a8df47dba85bbaaf17ccedab37 WatchSource:0}: Error finding container aff9940698e10598a2d0d237883d2a5b6aef73a8df47dba85bbaaf17ccedab37: Status 404 returned error can't find the container with id aff9940698e10598a2d0d237883d2a5b6aef73a8df47dba85bbaaf17ccedab37 Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.255943 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:58518->10.217.0.196:8775: read: connection reset by peer" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.257688 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:58528->10.217.0.196:8775: read: connection reset by peer" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.670640 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.763118 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e091d7a9-ce0f-4a98-ac4d-fed812949fad" path="/var/lib/kubelet/pods/e091d7a9-ce0f-4a98-ac4d-fed812949fad/volumes" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.808357 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-config-data\") pod \"d371839f-fc5b-4287-bccf-f0077497e3e2\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.808444 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-combined-ca-bundle\") pod \"d371839f-fc5b-4287-bccf-f0077497e3e2\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.808668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlpb8\" (UniqueName: \"kubernetes.io/projected/d371839f-fc5b-4287-bccf-f0077497e3e2-kube-api-access-tlpb8\") pod \"d371839f-fc5b-4287-bccf-f0077497e3e2\" (UID: \"d371839f-fc5b-4287-bccf-f0077497e3e2\") " Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.825523 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d371839f-fc5b-4287-bccf-f0077497e3e2-kube-api-access-tlpb8" (OuterVolumeSpecName: "kube-api-access-tlpb8") pod "d371839f-fc5b-4287-bccf-f0077497e3e2" (UID: "d371839f-fc5b-4287-bccf-f0077497e3e2"). InnerVolumeSpecName "kube-api-access-tlpb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.833056 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.833849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51a69c13-aa37-4fad-a00f-2c1aafc627c4","Type":"ContainerStarted","Data":"c88848e741a282e2b50f00ed1366b85a59b14e11a68b5771aa7e1ddd513a06ab"} Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.833894 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51a69c13-aa37-4fad-a00f-2c1aafc627c4","Type":"ContainerStarted","Data":"aff9940698e10598a2d0d237883d2a5b6aef73a8df47dba85bbaaf17ccedab37"} Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.847974 4795 generic.go:334] "Generic (PLEG): container finished" podID="d371839f-fc5b-4287-bccf-f0077497e3e2" containerID="726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f" exitCode=0 Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.848115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d371839f-fc5b-4287-bccf-f0077497e3e2","Type":"ContainerDied","Data":"726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f"} Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.848152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d371839f-fc5b-4287-bccf-f0077497e3e2","Type":"ContainerDied","Data":"beb1d8ba660111f8edac80cd30c4f87730c3b9d0b8bdc9e90308e8083fedf244"} Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.848177 4795 scope.go:117] "RemoveContainer" containerID="726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.848333 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.892878 4795 generic.go:334] "Generic (PLEG): container finished" podID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerID="d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b" exitCode=0 Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.892964 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab362166-6bc7-4832-932f-ba8417bf89e9","Type":"ContainerDied","Data":"d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b"} Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.893340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab362166-6bc7-4832-932f-ba8417bf89e9","Type":"ContainerDied","Data":"532db7dda5b4de7735ed801ca6827b756def688b709de2d6d63c97e2f8e57cc1"} Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.893552 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.910806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-combined-ca-bundle\") pod \"ab362166-6bc7-4832-932f-ba8417bf89e9\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.911219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-nova-metadata-tls-certs\") pod \"ab362166-6bc7-4832-932f-ba8417bf89e9\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.911339 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksc9n\" (UniqueName: \"kubernetes.io/projected/ab362166-6bc7-4832-932f-ba8417bf89e9-kube-api-access-ksc9n\") pod \"ab362166-6bc7-4832-932f-ba8417bf89e9\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.911464 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab362166-6bc7-4832-932f-ba8417bf89e9-logs\") pod \"ab362166-6bc7-4832-932f-ba8417bf89e9\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.911797 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-config-data\") pod \"ab362166-6bc7-4832-932f-ba8417bf89e9\" (UID: \"ab362166-6bc7-4832-932f-ba8417bf89e9\") " Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.912481 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlpb8\" (UniqueName: 
\"kubernetes.io/projected/d371839f-fc5b-4287-bccf-f0077497e3e2-kube-api-access-tlpb8\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.933363 4795 scope.go:117] "RemoveContainer" containerID="726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.936264 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab362166-6bc7-4832-932f-ba8417bf89e9-logs" (OuterVolumeSpecName: "logs") pod "ab362166-6bc7-4832-932f-ba8417bf89e9" (UID: "ab362166-6bc7-4832-932f-ba8417bf89e9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:48:22 crc kubenswrapper[4795]: E1205 08:48:22.939468 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f\": container with ID starting with 726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f not found: ID does not exist" containerID="726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.939527 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f"} err="failed to get container status \"726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f\": rpc error: code = NotFound desc = could not find container \"726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f\": container with ID starting with 726a269b2bbab055b89926cc3461ac4dc1c9f84f1134c324cc15fa4f291d918f not found: ID does not exist" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 08:48:22.939559 4795 scope.go:117] "RemoveContainer" containerID="d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b" Dec 05 08:48:22 crc kubenswrapper[4795]: I1205 
08:48:22.977650 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab362166-6bc7-4832-932f-ba8417bf89e9-kube-api-access-ksc9n" (OuterVolumeSpecName: "kube-api-access-ksc9n") pod "ab362166-6bc7-4832-932f-ba8417bf89e9" (UID: "ab362166-6bc7-4832-932f-ba8417bf89e9"). InnerVolumeSpecName "kube-api-access-ksc9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.016493 4795 scope.go:117] "RemoveContainer" containerID="51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.033593 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksc9n\" (UniqueName: \"kubernetes.io/projected/ab362166-6bc7-4832-932f-ba8417bf89e9-kube-api-access-ksc9n\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.033672 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab362166-6bc7-4832-932f-ba8417bf89e9-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.106971 4795 scope.go:117] "RemoveContainer" containerID="d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b" Dec 05 08:48:23 crc kubenswrapper[4795]: E1205 08:48:23.109729 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b\": container with ID starting with d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b not found: ID does not exist" containerID="d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.109767 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b"} 
err="failed to get container status \"d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b\": rpc error: code = NotFound desc = could not find container \"d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b\": container with ID starting with d29e6d1c8abdb5c4b5d46d24b731e9029cf5ecd5f48e9cc79d0d753ad55c1c7b not found: ID does not exist" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.109798 4795 scope.go:117] "RemoveContainer" containerID="51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa" Dec 05 08:48:23 crc kubenswrapper[4795]: E1205 08:48:23.111327 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa\": container with ID starting with 51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa not found: ID does not exist" containerID="51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.111384 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa"} err="failed to get container status \"51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa\": rpc error: code = NotFound desc = could not find container \"51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa\": container with ID starting with 51bbd3b1b788e0e7cbfa475706111e206b453c2fc08c624fb541dfdd1293ceaa not found: ID does not exist" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.137863 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-config-data" (OuterVolumeSpecName: "config-data") pod "d371839f-fc5b-4287-bccf-f0077497e3e2" (UID: "d371839f-fc5b-4287-bccf-f0077497e3e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.239357 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.266890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d371839f-fc5b-4287-bccf-f0077497e3e2" (UID: "d371839f-fc5b-4287-bccf-f0077497e3e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.289958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-config-data" (OuterVolumeSpecName: "config-data") pod "ab362166-6bc7-4832-932f-ba8417bf89e9" (UID: "ab362166-6bc7-4832-932f-ba8417bf89e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.308908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab362166-6bc7-4832-932f-ba8417bf89e9" (UID: "ab362166-6bc7-4832-932f-ba8417bf89e9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.342603 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.342943 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ab362166-6bc7-4832-932f-ba8417bf89e9" (UID: "ab362166-6bc7-4832-932f-ba8417bf89e9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.342970 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d371839f-fc5b-4287-bccf-f0077497e3e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.343064 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.445110 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab362166-6bc7-4832-932f-ba8417bf89e9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.529163 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.580464 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.613873 4795 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:48:23 crc kubenswrapper[4795]: E1205 08:48:23.614488 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-metadata" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.614509 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-metadata" Dec 05 08:48:23 crc kubenswrapper[4795]: E1205 08:48:23.614544 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d371839f-fc5b-4287-bccf-f0077497e3e2" containerName="nova-scheduler-scheduler" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.614553 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d371839f-fc5b-4287-bccf-f0077497e3e2" containerName="nova-scheduler-scheduler" Dec 05 08:48:23 crc kubenswrapper[4795]: E1205 08:48:23.614585 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-log" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.614593 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-log" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.614817 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d371839f-fc5b-4287-bccf-f0077497e3e2" containerName="nova-scheduler-scheduler" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.614841 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-metadata" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.614852 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" containerName="nova-metadata-log" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.615790 4795 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.621193 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.652684 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.686800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.719719 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.737651 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.744505 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.752194 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.752324 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.759324 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.769187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf163493-6f4f-47b8-9478-683cf5f07868-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.769252 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-469h8\" (UniqueName: \"kubernetes.io/projected/cf163493-6f4f-47b8-9478-683cf5f07868-kube-api-access-469h8\") pod \"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.769294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf163493-6f4f-47b8-9478-683cf5f07868-config-data\") pod \"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.871101 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-config-data\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.871161 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.871189 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67a74420-332a-4b9c-b677-d7c61bb7ce5e-logs\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.871211 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.871272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgcdl\" (UniqueName: \"kubernetes.io/projected/67a74420-332a-4b9c-b677-d7c61bb7ce5e-kube-api-access-pgcdl\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.871308 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf163493-6f4f-47b8-9478-683cf5f07868-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.871335 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-469h8\" (UniqueName: \"kubernetes.io/projected/cf163493-6f4f-47b8-9478-683cf5f07868-kube-api-access-469h8\") pod \"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.871360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf163493-6f4f-47b8-9478-683cf5f07868-config-data\") pod \"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.878781 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf163493-6f4f-47b8-9478-683cf5f07868-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.904036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf163493-6f4f-47b8-9478-683cf5f07868-config-data\") pod \"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.927651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-469h8\" (UniqueName: \"kubernetes.io/projected/cf163493-6f4f-47b8-9478-683cf5f07868-kube-api-access-469h8\") pod \"nova-scheduler-0\" (UID: \"cf163493-6f4f-47b8-9478-683cf5f07868\") " pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.930015 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51a69c13-aa37-4fad-a00f-2c1aafc627c4","Type":"ContainerStarted","Data":"f5f8826fe52d4877876d582eaa1884c10b354cfddc05afbd429fc5cc01ebe9fd"} Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.954122 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aee9ade7-8fe6-4548-aa94-032d421ac9ab","Type":"ContainerStarted","Data":"23723d78b0dd0de4caf73d1e73d1c5419c126c5c7824581a4cf8258cbb6dc4ef"} Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.954361 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.963113 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.966889 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.96686636 podStartE2EDuration="3.96686636s" podCreationTimestamp="2025-12-05 08:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:48:23.959933873 +0000 UTC m=+1455.532537612" watchObservedRunningTime="2025-12-05 08:48:23.96686636 +0000 UTC m=+1455.539470099" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.973796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.973867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-config-data\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.973899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67a74420-332a-4b9c-b677-d7c61bb7ce5e-logs\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.973930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.973966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgcdl\" (UniqueName: \"kubernetes.io/projected/67a74420-332a-4b9c-b677-d7c61bb7ce5e-kube-api-access-pgcdl\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.975307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67a74420-332a-4b9c-b677-d7c61bb7ce5e-logs\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.984273 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.984535 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.984872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a74420-332a-4b9c-b677-d7c61bb7ce5e-config-data\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:23 crc kubenswrapper[4795]: I1205 08:48:23.996154 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pgcdl\" (UniqueName: \"kubernetes.io/projected/67a74420-332a-4b9c-b677-d7c61bb7ce5e-kube-api-access-pgcdl\") pod \"nova-metadata-0\" (UID: \"67a74420-332a-4b9c-b677-d7c61bb7ce5e\") " pod="openstack/nova-metadata-0" Dec 05 08:48:24 crc kubenswrapper[4795]: I1205 08:48:24.015247 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.398508357 podStartE2EDuration="7.0152106s" podCreationTimestamp="2025-12-05 08:48:17 +0000 UTC" firstStartedPulling="2025-12-05 08:48:18.757100451 +0000 UTC m=+1450.329704190" lastFinishedPulling="2025-12-05 08:48:22.373802694 +0000 UTC m=+1453.946406433" observedRunningTime="2025-12-05 08:48:24.002882797 +0000 UTC m=+1455.575486536" watchObservedRunningTime="2025-12-05 08:48:24.0152106 +0000 UTC m=+1455.587814339" Dec 05 08:48:24 crc kubenswrapper[4795]: I1205 08:48:24.072480 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 05 08:48:24 crc kubenswrapper[4795]: I1205 08:48:24.760350 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab362166-6bc7-4832-932f-ba8417bf89e9" path="/var/lib/kubelet/pods/ab362166-6bc7-4832-932f-ba8417bf89e9/volumes" Dec 05 08:48:24 crc kubenswrapper[4795]: I1205 08:48:24.762100 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d371839f-fc5b-4287-bccf-f0077497e3e2" path="/var/lib/kubelet/pods/d371839f-fc5b-4287-bccf-f0077497e3e2/volumes" Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.046979 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.047241 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.048980 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"8d58d2980a0272b8f9412a1472e5d911f076262d4fb4bdca07ace061afd28965"} pod="openstack/horizon-797f5f5996-7wlp4" containerMessage="Container horizon failed startup probe, will be restarted" Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.049034 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" containerID="cri-o://8d58d2980a0272b8f9412a1472e5d911f076262d4fb4bdca07ace061afd28965" gracePeriod=30 Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.076244 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.106195 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.216387 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.991515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cf163493-6f4f-47b8-9478-683cf5f07868","Type":"ContainerStarted","Data":"7e2a3b2de6a475c205cf0405eca2dc137cc287cd1c64c0a0f084f1b37f60016a"} Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.993373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cf163493-6f4f-47b8-9478-683cf5f07868","Type":"ContainerStarted","Data":"2a4f3f1236b152c4211af1a027f424d913abeea12867ce16f3c1fe6b04c2e8ee"} Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.996148 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67a74420-332a-4b9c-b677-d7c61bb7ce5e","Type":"ContainerStarted","Data":"ad12affa6e2ddf028c1a0ee86814280fc0a2a937151dcfb3ed06e0c994af6df5"} Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.996282 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67a74420-332a-4b9c-b677-d7c61bb7ce5e","Type":"ContainerStarted","Data":"059ff1b98cf66c8cde068e2157e9f78cf99441c579d6d9f601ab76bf8948d2ba"} Dec 05 08:48:25 crc kubenswrapper[4795]: I1205 08:48:25.996304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"67a74420-332a-4b9c-b677-d7c61bb7ce5e","Type":"ContainerStarted","Data":"4c584b8d9cf9352e2516c569bc35c54fa6a96a93cb021fa76992866e49983c90"} Dec 05 08:48:26 crc kubenswrapper[4795]: I1205 08:48:26.024547 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.024510359 podStartE2EDuration="3.024510359s" podCreationTimestamp="2025-12-05 08:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:48:26.013673915 +0000 UTC m=+1457.586277664" watchObservedRunningTime="2025-12-05 08:48:26.024510359 +0000 UTC m=+1457.597114098" Dec 05 08:48:26 crc kubenswrapper[4795]: I1205 08:48:26.041010 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.040978094 podStartE2EDuration="3.040978094s" podCreationTimestamp="2025-12-05 08:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:48:26.038787626 +0000 UTC m=+1457.611391385" watchObservedRunningTime="2025-12-05 08:48:26.040978094 +0000 UTC m=+1457.613581833" Dec 05 08:48:27 crc kubenswrapper[4795]: I1205 
08:48:27.502273 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-57b485fdb4-h9cjs" Dec 05 08:48:27 crc kubenswrapper[4795]: I1205 08:48:27.615914 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-797f5f5996-7wlp4"] Dec 05 08:48:28 crc kubenswrapper[4795]: I1205 08:48:28.964300 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 05 08:48:29 crc kubenswrapper[4795]: I1205 08:48:29.032328 4795 generic.go:334] "Generic (PLEG): container finished" podID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerID="8d58d2980a0272b8f9412a1472e5d911f076262d4fb4bdca07ace061afd28965" exitCode=0 Dec 05 08:48:29 crc kubenswrapper[4795]: I1205 08:48:29.032482 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerDied","Data":"8d58d2980a0272b8f9412a1472e5d911f076262d4fb4bdca07ace061afd28965"} Dec 05 08:48:29 crc kubenswrapper[4795]: I1205 08:48:29.032606 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" containerID="cri-o://e3745c4e0ecc79cb72af2bf1778de70093de8a34de4843f1995644079009604a" gracePeriod=30 Dec 05 08:48:29 crc kubenswrapper[4795]: I1205 08:48:29.032692 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-797f5f5996-7wlp4" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon-log" containerID="cri-o://f314e1ee64c9bb2c50d114e316eaa8f4a1d8a694b7ce30ec1791781a0e7d7a7f" gracePeriod=30 Dec 05 08:48:29 crc kubenswrapper[4795]: I1205 08:48:29.032929 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" 
event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerStarted","Data":"e3745c4e0ecc79cb72af2bf1778de70093de8a34de4843f1995644079009604a"} Dec 05 08:48:29 crc kubenswrapper[4795]: I1205 08:48:29.033038 4795 scope.go:117] "RemoveContainer" containerID="2b1896dbf9af209dafcaec7ed5c0e7f124f57325e662ab2dcc06df5dc35609e4" Dec 05 08:48:29 crc kubenswrapper[4795]: I1205 08:48:29.073573 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 08:48:29 crc kubenswrapper[4795]: I1205 08:48:29.073653 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 05 08:48:30 crc kubenswrapper[4795]: I1205 08:48:30.032470 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:48:31 crc kubenswrapper[4795]: I1205 08:48:31.308185 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 08:48:31 crc kubenswrapper[4795]: I1205 08:48:31.308649 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 05 08:48:32 crc kubenswrapper[4795]: I1205 08:48:32.319819 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="51a69c13-aa37-4fad-a00f-2c1aafc627c4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:48:32 crc kubenswrapper[4795]: I1205 08:48:32.319819 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="51a69c13-aa37-4fad-a00f-2c1aafc627c4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:48:33 crc kubenswrapper[4795]: I1205 08:48:33.964508 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 05 08:48:34 crc kubenswrapper[4795]: I1205 08:48:34.072830 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 08:48:34 crc kubenswrapper[4795]: I1205 08:48:34.073305 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 05 08:48:34 crc kubenswrapper[4795]: I1205 08:48:34.087096 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 05 08:48:34 crc kubenswrapper[4795]: I1205 08:48:34.141897 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 05 08:48:35 crc kubenswrapper[4795]: I1205 08:48:35.085900 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="67a74420-332a-4b9c-b677-d7c61bb7ce5e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:48:35 crc kubenswrapper[4795]: I1205 08:48:35.085907 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="67a74420-332a-4b9c-b677-d7c61bb7ce5e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 08:48:40 crc kubenswrapper[4795]: I1205 08:48:40.828142 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:48:40 crc kubenswrapper[4795]: I1205 08:48:40.828933 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:48:40 crc kubenswrapper[4795]: I1205 08:48:40.828998 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:48:40 crc kubenswrapper[4795]: I1205 08:48:40.830024 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d86e89d94962757844e50b0a42fc344e8a17a880839200160f5350ade5d60002"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 08:48:40 crc kubenswrapper[4795]: I1205 08:48:40.830093 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://d86e89d94962757844e50b0a42fc344e8a17a880839200160f5350ade5d60002" gracePeriod=600 Dec 05 08:48:41 crc kubenswrapper[4795]: I1205 08:48:41.184195 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="d86e89d94962757844e50b0a42fc344e8a17a880839200160f5350ade5d60002" exitCode=0 Dec 05 08:48:41 crc kubenswrapper[4795]: I1205 08:48:41.184417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"d86e89d94962757844e50b0a42fc344e8a17a880839200160f5350ade5d60002"} Dec 05 08:48:41 crc kubenswrapper[4795]: I1205 08:48:41.184576 4795 scope.go:117] "RemoveContainer" 
containerID="c93ddbd048ff8d41779ab69c4d06b72c0bf8343289b56925c9b595ac0b0536d9" Dec 05 08:48:41 crc kubenswrapper[4795]: I1205 08:48:41.316579 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 08:48:41 crc kubenswrapper[4795]: I1205 08:48:41.319073 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 08:48:41 crc kubenswrapper[4795]: I1205 08:48:41.322980 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 05 08:48:41 crc kubenswrapper[4795]: I1205 08:48:41.339390 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 08:48:42 crc kubenswrapper[4795]: I1205 08:48:42.199744 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca"} Dec 05 08:48:42 crc kubenswrapper[4795]: I1205 08:48:42.201672 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 05 08:48:42 crc kubenswrapper[4795]: I1205 08:48:42.208231 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 05 08:48:44 crc kubenswrapper[4795]: I1205 08:48:44.082999 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 08:48:44 crc kubenswrapper[4795]: I1205 08:48:44.084028 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 05 08:48:44 crc kubenswrapper[4795]: I1205 08:48:44.089458 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 08:48:44 crc kubenswrapper[4795]: I1205 08:48:44.101651 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 05 08:48:48 crc kubenswrapper[4795]: I1205 08:48:48.240003 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 05 08:48:49 crc kubenswrapper[4795]: I1205 08:48:49.398310 4795 scope.go:117] "RemoveContainer" containerID="851941a42cf2fd0a1266f20756cb4f5c90156ec2f0adc75d9897a69d9fa0be7a" Dec 05 08:48:49 crc kubenswrapper[4795]: I1205 08:48:49.441648 4795 scope.go:117] "RemoveContainer" containerID="a7b69bd72ac337f9b3de8982376982fb6ec24e7ef3b434b9fcb5d32f98dc54c5" Dec 05 08:48:56 crc kubenswrapper[4795]: I1205 08:48:56.930883 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9f4pt"] Dec 05 08:48:56 crc kubenswrapper[4795]: I1205 08:48:56.935560 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:56 crc kubenswrapper[4795]: I1205 08:48:56.945703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9f4pt"] Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.042681 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-catalog-content\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.042804 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-utilities\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.042892 
4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwvfk\" (UniqueName: \"kubernetes.io/projected/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-kube-api-access-nwvfk\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.145057 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-catalog-content\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.145192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-utilities\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.145231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwvfk\" (UniqueName: \"kubernetes.io/projected/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-kube-api-access-nwvfk\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.145774 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-catalog-content\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.146287 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-utilities\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.176593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwvfk\" (UniqueName: \"kubernetes.io/projected/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-kube-api-access-nwvfk\") pod \"redhat-operators-9f4pt\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.297246 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:48:57 crc kubenswrapper[4795]: I1205 08:48:57.912076 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9f4pt"] Dec 05 08:48:58 crc kubenswrapper[4795]: I1205 08:48:58.390624 4795 generic.go:334] "Generic (PLEG): container finished" podID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerID="2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d" exitCode=0 Dec 05 08:48:58 crc kubenswrapper[4795]: I1205 08:48:58.390747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f4pt" event={"ID":"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc","Type":"ContainerDied","Data":"2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d"} Dec 05 08:48:58 crc kubenswrapper[4795]: I1205 08:48:58.391022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f4pt" event={"ID":"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc","Type":"ContainerStarted","Data":"c4572bed67900beb90e9c75775ce3fd9b81845230fb35f6b5b662185269ceac4"} Dec 05 08:48:58 crc kubenswrapper[4795]: I1205 
08:48:58.652838 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:48:59 crc kubenswrapper[4795]: I1205 08:48:59.432039 4795 generic.go:334] "Generic (PLEG): container finished" podID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerID="e3745c4e0ecc79cb72af2bf1778de70093de8a34de4843f1995644079009604a" exitCode=137 Dec 05 08:48:59 crc kubenswrapper[4795]: I1205 08:48:59.432430 4795 generic.go:334] "Generic (PLEG): container finished" podID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerID="f314e1ee64c9bb2c50d114e316eaa8f4a1d8a694b7ce30ec1791781a0e7d7a7f" exitCode=137 Dec 05 08:48:59 crc kubenswrapper[4795]: I1205 08:48:59.432460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerDied","Data":"e3745c4e0ecc79cb72af2bf1778de70093de8a34de4843f1995644079009604a"} Dec 05 08:48:59 crc kubenswrapper[4795]: I1205 08:48:59.432494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerDied","Data":"f314e1ee64c9bb2c50d114e316eaa8f4a1d8a694b7ce30ec1791781a0e7d7a7f"} Dec 05 08:48:59 crc kubenswrapper[4795]: I1205 08:48:59.432515 4795 scope.go:117] "RemoveContainer" containerID="8d58d2980a0272b8f9412a1472e5d911f076262d4fb4bdca07ace061afd28965" Dec 05 08:48:59 crc kubenswrapper[4795]: E1205 08:48:59.607626 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod821b3890_4d8d_4ce0_b3b2_55793a9c98cd.slice/crio-conmon-f314e1ee64c9bb2c50d114e316eaa8f4a1d8a694b7ce30ec1791781a0e7d7a7f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod821b3890_4d8d_4ce0_b3b2_55793a9c98cd.slice/crio-conmon-e3745c4e0ecc79cb72af2bf1778de70093de8a34de4843f1995644079009604a.scope\": RecentStats: unable to find data in memory cache]" Dec 05 08:48:59 crc kubenswrapper[4795]: I1205 08:48:59.897930 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.023424 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-tls-certs\") pod \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.023636 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfp5j\" (UniqueName: \"kubernetes.io/projected/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-kube-api-access-lfp5j\") pod \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.023723 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-logs\") pod \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.023786 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-combined-ca-bundle\") pod \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.023821 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-scripts\") pod \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.023862 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-secret-key\") pod \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.023953 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-config-data\") pod \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\" (UID: \"821b3890-4d8d-4ce0-b3b2-55793a9c98cd\") " Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.024884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-logs" (OuterVolumeSpecName: "logs") pod "821b3890-4d8d-4ce0-b3b2-55793a9c98cd" (UID: "821b3890-4d8d-4ce0-b3b2-55793a9c98cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.047736 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "821b3890-4d8d-4ce0-b3b2-55793a9c98cd" (UID: "821b3890-4d8d-4ce0-b3b2-55793a9c98cd"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.068885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-kube-api-access-lfp5j" (OuterVolumeSpecName: "kube-api-access-lfp5j") pod "821b3890-4d8d-4ce0-b3b2-55793a9c98cd" (UID: "821b3890-4d8d-4ce0-b3b2-55793a9c98cd"). InnerVolumeSpecName "kube-api-access-lfp5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.092536 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-config-data" (OuterVolumeSpecName: "config-data") pod "821b3890-4d8d-4ce0-b3b2-55793a9c98cd" (UID: "821b3890-4d8d-4ce0-b3b2-55793a9c98cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.109781 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "821b3890-4d8d-4ce0-b3b2-55793a9c98cd" (UID: "821b3890-4d8d-4ce0-b3b2-55793a9c98cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.129433 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfp5j\" (UniqueName: \"kubernetes.io/projected/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-kube-api-access-lfp5j\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.129476 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-logs\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.129488 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.129497 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.129506 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.148174 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-scripts" (OuterVolumeSpecName: "scripts") pod "821b3890-4d8d-4ce0-b3b2-55793a9c98cd" (UID: "821b3890-4d8d-4ce0-b3b2-55793a9c98cd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.212854 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "821b3890-4d8d-4ce0-b3b2-55793a9c98cd" (UID: "821b3890-4d8d-4ce0-b3b2-55793a9c98cd"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.232001 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-scripts\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.232312 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/821b3890-4d8d-4ce0-b3b2-55793a9c98cd-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.448756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f4pt" event={"ID":"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc","Type":"ContainerStarted","Data":"a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a"} Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.460508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-797f5f5996-7wlp4" event={"ID":"821b3890-4d8d-4ce0-b3b2-55793a9c98cd","Type":"ContainerDied","Data":"88f716fc372ac5bdaf05e54717c04f9411af1fe14f02f0e5a9c4d7a63456e75b"} Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.460555 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-797f5f5996-7wlp4" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.460602 4795 scope.go:117] "RemoveContainer" containerID="e3745c4e0ecc79cb72af2bf1778de70093de8a34de4843f1995644079009604a" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.502824 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.581144 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-797f5f5996-7wlp4"] Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.606377 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-797f5f5996-7wlp4"] Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.711686 4795 scope.go:117] "RemoveContainer" containerID="f314e1ee64c9bb2c50d114e316eaa8f4a1d8a694b7ce30ec1791781a0e7d7a7f" Dec 05 08:49:00 crc kubenswrapper[4795]: I1205 08:49:00.761009 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" path="/var/lib/kubelet/pods/821b3890-4d8d-4ce0-b3b2-55793a9c98cd/volumes" Dec 05 08:49:05 crc kubenswrapper[4795]: I1205 08:49:05.548491 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerName="rabbitmq" containerID="cri-o://d28e7a08bec20bcef2029f17c27a0e6d302c0975fd9378fc98d20589526ddab7" gracePeriod=604794 Dec 05 08:49:06 crc kubenswrapper[4795]: I1205 08:49:06.677110 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerName="rabbitmq" containerID="cri-o://3154fd32dc6a1aff1527c514daea43379e594659496943260e3a099d08c27be0" gracePeriod=604794 Dec 05 08:49:07 crc kubenswrapper[4795]: I1205 08:49:07.543036 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerID="a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a" exitCode=0 Dec 05 08:49:07 crc kubenswrapper[4795]: I1205 08:49:07.543099 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f4pt" event={"ID":"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc","Type":"ContainerDied","Data":"a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a"} Dec 05 08:49:08 crc kubenswrapper[4795]: I1205 08:49:08.556303 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f4pt" event={"ID":"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc","Type":"ContainerStarted","Data":"03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911"} Dec 05 08:49:10 crc kubenswrapper[4795]: I1205 08:49:10.773631 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 05 08:49:11 crc kubenswrapper[4795]: I1205 08:49:11.341195 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 05 08:49:12 crc kubenswrapper[4795]: I1205 08:49:12.596768 4795 generic.go:334] "Generic (PLEG): container finished" podID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerID="d28e7a08bec20bcef2029f17c27a0e6d302c0975fd9378fc98d20589526ddab7" exitCode=0 Dec 05 08:49:12 crc kubenswrapper[4795]: I1205 08:49:12.597156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"956aa512-9ab5-4c74-863b-3ed2a14535d9","Type":"ContainerDied","Data":"d28e7a08bec20bcef2029f17c27a0e6d302c0975fd9378fc98d20589526ddab7"} Dec 05 08:49:17 crc kubenswrapper[4795]: 
I1205 08:49:13.301397 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303161 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf66k\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-kube-api-access-qf66k\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303288 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-server-conf\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/956aa512-9ab5-4c74-863b-3ed2a14535d9-erlang-cookie-secret\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303473 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-tls\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303493 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/956aa512-9ab5-4c74-863b-3ed2a14535d9-pod-info\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303537 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-config-data\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303556 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-confd\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303607 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-plugins-conf\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.303646 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-plugins\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.305305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.305757 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.314339 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-kube-api-access-qf66k" (OuterVolumeSpecName: "kube-api-access-qf66k") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "kube-api-access-qf66k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.317263 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.321408 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/956aa512-9ab5-4c74-863b-3ed2a14535d9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.328256 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/956aa512-9ab5-4c74-863b-3ed2a14535d9-pod-info" (OuterVolumeSpecName: "pod-info") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.363090 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9f4pt" podStartSLOduration=7.81170537 podStartE2EDuration="17.363053338s" podCreationTimestamp="2025-12-05 08:48:56 +0000 UTC" firstStartedPulling="2025-12-05 08:48:58.392692199 +0000 UTC m=+1489.965295938" lastFinishedPulling="2025-12-05 08:49:07.944040167 +0000 UTC m=+1499.516643906" observedRunningTime="2025-12-05 08:49:08.58760495 +0000 UTC m=+1500.160208689" watchObservedRunningTime="2025-12-05 08:49:13.363053338 +0000 UTC m=+1504.935657077" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.365126 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-config-data" (OuterVolumeSpecName: "config-data") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.406471 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.406571 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-erlang-cookie\") pod \"956aa512-9ab5-4c74-863b-3ed2a14535d9\" (UID: \"956aa512-9ab5-4c74-863b-3ed2a14535d9\") " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.407090 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf66k\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-kube-api-access-qf66k\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.407111 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/956aa512-9ab5-4c74-863b-3ed2a14535d9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.407126 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.407138 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/956aa512-9ab5-4c74-863b-3ed2a14535d9-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.407149 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.407163 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.407174 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.407803 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.412521 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.430586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-server-conf" (OuterVolumeSpecName: "server-conf") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.489793 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "956aa512-9ab5-4c74-863b-3ed2a14535d9" (UID: "956aa512-9ab5-4c74-863b-3ed2a14535d9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.510341 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.510399 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.510415 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/956aa512-9ab5-4c74-863b-3ed2a14535d9-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.510427 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/956aa512-9ab5-4c74-863b-3ed2a14535d9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.547778 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.621735 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.638468 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"956aa512-9ab5-4c74-863b-3ed2a14535d9","Type":"ContainerDied","Data":"8bc2af43fb858b496565f4d8af07cf291445ddf66d4ddf4d09526c7d558b60c7"} Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.638536 4795 scope.go:117] "RemoveContainer" containerID="d28e7a08bec20bcef2029f17c27a0e6d302c0975fd9378fc98d20589526ddab7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.638746 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.687297 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.690562 4795 scope.go:117] "RemoveContainer" containerID="7ee452d101693c0da47a464215ca51c496b979b8ffd1ec947fd8d52320f0ac04" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.700097 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.739984 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:49:17 crc kubenswrapper[4795]: E1205 08:49:13.740583 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerName="rabbitmq" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740598 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerName="rabbitmq" Dec 05 08:49:17 crc kubenswrapper[4795]: E1205 08:49:13.740629 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerName="setup-container" Dec 05 08:49:17 crc 
kubenswrapper[4795]: I1205 08:49:13.740635 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerName="setup-container" Dec 05 08:49:17 crc kubenswrapper[4795]: E1205 08:49:13.740648 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon-log" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740655 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon-log" Dec 05 08:49:17 crc kubenswrapper[4795]: E1205 08:49:13.740668 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740673 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: E1205 08:49:13.740680 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740686 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: E1205 08:49:13.740694 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740703 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: E1205 08:49:13.740723 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740729 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740925 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740939 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740949 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740966 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740975 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon-log" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.740988 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9" containerName="rabbitmq" Dec 05 08:49:17 crc kubenswrapper[4795]: E1205 08:49:13.741214 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.741222 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.741426 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="821b3890-4d8d-4ce0-b3b2-55793a9c98cd" containerName="horizon" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.742287 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.752653 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.752883 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.753005 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-q5tjc" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.753740 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.758937 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.904033 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.904678 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.908815 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.928683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.928907 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/7ad9b797-2884-4af6-8a64-8f82b3523d3e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930044 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930092 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpzlm\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-kube-api-access-xpzlm\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930132 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ad9b797-2884-4af6-8a64-8f82b3523d3e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930171 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:13.930399 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.032854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.032914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.032968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.033059 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.033125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.033175 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.033210 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ad9b797-2884-4af6-8a64-8f82b3523d3e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.033227 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.033249 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.033270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpzlm\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-kube-api-access-xpzlm\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.033288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ad9b797-2884-4af6-8a64-8f82b3523d3e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.034719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.035085 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.035709 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.035856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.036086 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-config-data\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.036286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7ad9b797-2884-4af6-8a64-8f82b3523d3e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " 
pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.039527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.040219 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.042980 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7ad9b797-2884-4af6-8a64-8f82b3523d3e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.043168 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7ad9b797-2884-4af6-8a64-8f82b3523d3e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.056827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpzlm\" (UniqueName: \"kubernetes.io/projected/7ad9b797-2884-4af6-8a64-8f82b3523d3e-kube-api-access-xpzlm\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.072119 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"7ad9b797-2884-4af6-8a64-8f82b3523d3e\") " pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.086541 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.655114 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerID="3154fd32dc6a1aff1527c514daea43379e594659496943260e3a099d08c27be0" exitCode=0 Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.655601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec8515f5-24b3-4930-9df2-90c25e2f8e6e","Type":"ContainerDied","Data":"3154fd32dc6a1aff1527c514daea43379e594659496943260e3a099d08c27be0"} Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:14.761140 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956aa512-9ab5-4c74-863b-3ed2a14535d9" path="/var/lib/kubelet/pods/956aa512-9ab5-4c74-863b-3ed2a14535d9/volumes" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.309933 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pzpp7"] Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.312448 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.315285 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.337312 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pzpp7"] Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.496932 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-svc\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.497012 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.497167 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.497432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " 
pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.497571 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-config\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.497781 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrr28\" (UniqueName: \"kubernetes.io/projected/caa232a3-a2a5-42ef-ad79-db32d33c2f75-kube-api-access-jrr28\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.497828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.600025 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.600090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: 
\"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.600177 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.600244 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-config\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.600299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrr28\" (UniqueName: \"kubernetes.io/projected/caa232a3-a2a5-42ef-ad79-db32d33c2f75-kube-api-access-jrr28\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.600335 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.600446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-svc\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " 
pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.601720 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.601953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-svc\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.602093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.602495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.602896 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 
08:49:16.603675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-config\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.639073 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrr28\" (UniqueName: \"kubernetes.io/projected/caa232a3-a2a5-42ef-ad79-db32d33c2f75-kube-api-access-jrr28\") pod \"dnsmasq-dns-d558885bc-pzpp7\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:16.641243 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:17.298263 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:17.298718 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:49:17 crc kubenswrapper[4795]: I1205 08:49:17.957173 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.081892 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.123152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pzpp7"] Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.134268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-tls\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.134859 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.134932 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-server-conf\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.134977 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-erlang-cookie\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.135037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfsr4\" (UniqueName: 
\"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-kube-api-access-nfsr4\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.135097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-config-data\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.135168 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-erlang-cookie-secret\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.135279 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-confd\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.135402 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-pod-info\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.135461 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-plugins-conf\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 
08:49:18.135507 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-plugins\") pod \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\" (UID: \"ec8515f5-24b3-4930-9df2-90c25e2f8e6e\") " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.145327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.146775 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-kube-api-access-nfsr4" (OuterVolumeSpecName: "kube-api-access-nfsr4") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "kube-api-access-nfsr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: W1205 08:49:18.150701 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaa232a3_a2a5_42ef_ad79_db32d33c2f75.slice/crio-e2ee7f0aeb746e9f98ee68e41ffd7e523b96b3e594e974b130a0572e7a11ff3a WatchSource:0}: Error finding container e2ee7f0aeb746e9f98ee68e41ffd7e523b96b3e594e974b130a0572e7a11ff3a: Status 404 returned error can't find the container with id e2ee7f0aeb746e9f98ee68e41ffd7e523b96b3e594e974b130a0572e7a11ff3a Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.154893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.156422 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.161192 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-pod-info" (OuterVolumeSpecName: "pod-info") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.172445 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.181545 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.200162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.242721 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-pod-info\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.242816 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.242831 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.242841 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.242890 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.242903 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.242913 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfsr4\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-kube-api-access-nfsr4\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.242923 
4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.285161 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-config-data" (OuterVolumeSpecName: "config-data") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.320109 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.337401 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-server-conf" (OuterVolumeSpecName: "server-conf") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.345626 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.345667 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-server-conf\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.345683 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.368873 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9f4pt" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="registry-server" probeResult="failure" output=< Dec 05 08:49:18 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 08:49:18 crc kubenswrapper[4795]: > Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.509834 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ec8515f5-24b3-4930-9df2-90c25e2f8e6e" (UID: "ec8515f5-24b3-4930-9df2-90c25e2f8e6e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.551470 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec8515f5-24b3-4930-9df2-90c25e2f8e6e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.711928 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" event={"ID":"caa232a3-a2a5-42ef-ad79-db32d33c2f75","Type":"ContainerStarted","Data":"13bfbf7fe1f191fe24fe49e4647d6e034191d73e1e2574fc207c2987ae14c730"} Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.711991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" event={"ID":"caa232a3-a2a5-42ef-ad79-db32d33c2f75","Type":"ContainerStarted","Data":"e2ee7f0aeb746e9f98ee68e41ffd7e523b96b3e594e974b130a0572e7a11ff3a"} Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.715622 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ad9b797-2884-4af6-8a64-8f82b3523d3e","Type":"ContainerStarted","Data":"34468e099431ca4f07bad953a2483ab2a650cd953f76a3453cada7de71f4b33f"} Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.717911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ec8515f5-24b3-4930-9df2-90c25e2f8e6e","Type":"ContainerDied","Data":"f42edc77403d08cb5c45460eb21d61a862047365e32e409475770ff85828eb70"} Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.718002 4795 scope.go:117] "RemoveContainer" containerID="3154fd32dc6a1aff1527c514daea43379e594659496943260e3a099d08c27be0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.718250 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.752167 4795 scope.go:117] "RemoveContainer" containerID="1c8eefd545af59a05a444f037361105679aae6c1a607df37c24bb29c03aba3d2" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.796105 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.817603 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.834493 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:49:18 crc kubenswrapper[4795]: E1205 08:49:18.835060 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerName="rabbitmq" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.835079 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerName="rabbitmq" Dec 05 08:49:18 crc kubenswrapper[4795]: E1205 08:49:18.835100 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerName="setup-container" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.835106 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerName="setup-container" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.835353 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" containerName="rabbitmq" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.836516 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.845667 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.845851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.846029 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.846160 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xtb2s" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.846288 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.846407 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.846533 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.854511 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.971049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.971513 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.971572 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5139f934-4821-4038-9401-c22f469bf070-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.973497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.973656 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.973835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.973884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/5139f934-4821-4038-9401-c22f469bf070-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.974003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6pvp\" (UniqueName: \"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-kube-api-access-j6pvp\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.974064 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.974112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:18 crc kubenswrapper[4795]: I1205 08:49:18.974146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.075919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.075964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076006 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5139f934-4821-4038-9401-c22f469bf070-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076038 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076080 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5139f934-4821-4038-9401-c22f469bf070-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6pvp\" (UniqueName: \"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-kube-api-access-j6pvp\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076175 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.076368 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.077239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.077740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.077739 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.078205 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.078746 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5139f934-4821-4038-9401-c22f469bf070-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.086794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.092515 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5139f934-4821-4038-9401-c22f469bf070-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.094978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5139f934-4821-4038-9401-c22f469bf070-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.095728 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.098960 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6pvp\" (UniqueName: 
\"kubernetes.io/projected/5139f934-4821-4038-9401-c22f469bf070-kube-api-access-j6pvp\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.213102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5139f934-4821-4038-9401-c22f469bf070\") " pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.269804 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.732029 4795 generic.go:334] "Generic (PLEG): container finished" podID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" containerID="13bfbf7fe1f191fe24fe49e4647d6e034191d73e1e2574fc207c2987ae14c730" exitCode=0 Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.732420 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" event={"ID":"caa232a3-a2a5-42ef-ad79-db32d33c2f75","Type":"ContainerDied","Data":"13bfbf7fe1f191fe24fe49e4647d6e034191d73e1e2574fc207c2987ae14c730"} Dec 05 08:49:19 crc kubenswrapper[4795]: I1205 08:49:19.956582 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 05 08:49:20 crc kubenswrapper[4795]: W1205 08:49:20.013569 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5139f934_4821_4038_9401_c22f469bf070.slice/crio-90daee55a330c5bdbd4fe0184e2e980c5139d0eb9152edcfc369f9fef834426d WatchSource:0}: Error finding container 90daee55a330c5bdbd4fe0184e2e980c5139d0eb9152edcfc369f9fef834426d: Status 404 returned error can't find the container with id 
90daee55a330c5bdbd4fe0184e2e980c5139d0eb9152edcfc369f9fef834426d Dec 05 08:49:20 crc kubenswrapper[4795]: I1205 08:49:20.759855 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8515f5-24b3-4930-9df2-90c25e2f8e6e" path="/var/lib/kubelet/pods/ec8515f5-24b3-4930-9df2-90c25e2f8e6e/volumes" Dec 05 08:49:20 crc kubenswrapper[4795]: I1205 08:49:20.761659 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:20 crc kubenswrapper[4795]: I1205 08:49:20.761772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5139f934-4821-4038-9401-c22f469bf070","Type":"ContainerStarted","Data":"90daee55a330c5bdbd4fe0184e2e980c5139d0eb9152edcfc369f9fef834426d"} Dec 05 08:49:20 crc kubenswrapper[4795]: I1205 08:49:20.761837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" event={"ID":"caa232a3-a2a5-42ef-ad79-db32d33c2f75","Type":"ContainerStarted","Data":"ce80d725ee6315e111101dafb5d32da3aba4e9418ef8dafa2249abbbb6fcc646"} Dec 05 08:49:20 crc kubenswrapper[4795]: I1205 08:49:20.761892 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ad9b797-2884-4af6-8a64-8f82b3523d3e","Type":"ContainerStarted","Data":"87ffca145b007ea985987f2f625e98b2d0704468c3d7ae342b904a086773c5a1"} Dec 05 08:49:20 crc kubenswrapper[4795]: I1205 08:49:20.795870 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" podStartSLOduration=4.795832809 podStartE2EDuration="4.795832809s" podCreationTimestamp="2025-12-05 08:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:49:20.779504336 +0000 UTC m=+1512.352108075" watchObservedRunningTime="2025-12-05 08:49:20.795832809 +0000 UTC m=+1512.368436548" Dec 05 
08:49:21 crc kubenswrapper[4795]: I1205 08:49:21.777761 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5139f934-4821-4038-9401-c22f469bf070","Type":"ContainerStarted","Data":"1efca41859888c797476710313ae3d2a10139f8edc7dc5cd9a70a23945161d13"} Dec 05 08:49:26 crc kubenswrapper[4795]: I1205 08:49:26.642848 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:26 crc kubenswrapper[4795]: I1205 08:49:26.728240 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-tbd8b"] Dec 05 08:49:26 crc kubenswrapper[4795]: I1205 08:49:26.728803 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" podUID="d15fba4a-f47b-4143-ba07-6d368e19f33f" containerName="dnsmasq-dns" containerID="cri-o://1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e" gracePeriod=10 Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.085452 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-zwsqp"] Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.087912 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.102255 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-zwsqp"] Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.186536 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.186637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.186678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.186698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-config\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.186719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.186974 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.187290 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqj5\" (UniqueName: \"kubernetes.io/projected/b895bd86-3c76-4653-8527-d1cfef368c37-kube-api-access-nhqj5\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.289633 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqj5\" (UniqueName: \"kubernetes.io/projected/b895bd86-3c76-4653-8527-d1cfef368c37-kube-api-access-nhqj5\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.289749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.289775 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.289795 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-config\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.289813 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.289836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.289884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.291248 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.291302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.291524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.291680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-config\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.292044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.292120 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b895bd86-3c76-4653-8527-d1cfef368c37-dns-swift-storage-0\") pod 
\"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.326871 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqj5\" (UniqueName: \"kubernetes.io/projected/b895bd86-3c76-4653-8527-d1cfef368c37-kube-api-access-nhqj5\") pod \"dnsmasq-dns-67cb876dc9-zwsqp\" (UID: \"b895bd86-3c76-4653-8527-d1cfef368c37\") " pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.359200 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.407680 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.411771 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.485979 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.493819 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-nb\") pod \"d15fba4a-f47b-4143-ba07-6d368e19f33f\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.493879 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-sb\") pod \"d15fba4a-f47b-4143-ba07-6d368e19f33f\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.494003 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgv49\" (UniqueName: \"kubernetes.io/projected/d15fba4a-f47b-4143-ba07-6d368e19f33f-kube-api-access-jgv49\") pod \"d15fba4a-f47b-4143-ba07-6d368e19f33f\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.494085 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-svc\") pod \"d15fba4a-f47b-4143-ba07-6d368e19f33f\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.494239 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-config\") pod \"d15fba4a-f47b-4143-ba07-6d368e19f33f\" (UID: 
\"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.494331 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-swift-storage-0\") pod \"d15fba4a-f47b-4143-ba07-6d368e19f33f\" (UID: \"d15fba4a-f47b-4143-ba07-6d368e19f33f\") " Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.552391 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15fba4a-f47b-4143-ba07-6d368e19f33f-kube-api-access-jgv49" (OuterVolumeSpecName: "kube-api-access-jgv49") pod "d15fba4a-f47b-4143-ba07-6d368e19f33f" (UID: "d15fba4a-f47b-4143-ba07-6d368e19f33f"). InnerVolumeSpecName "kube-api-access-jgv49". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.590131 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d15fba4a-f47b-4143-ba07-6d368e19f33f" (UID: "d15fba4a-f47b-4143-ba07-6d368e19f33f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.598089 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.598156 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgv49\" (UniqueName: \"kubernetes.io/projected/d15fba4a-f47b-4143-ba07-6d368e19f33f-kube-api-access-jgv49\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.616754 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d15fba4a-f47b-4143-ba07-6d368e19f33f" (UID: "d15fba4a-f47b-4143-ba07-6d368e19f33f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.617462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-config" (OuterVolumeSpecName: "config") pod "d15fba4a-f47b-4143-ba07-6d368e19f33f" (UID: "d15fba4a-f47b-4143-ba07-6d368e19f33f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.626003 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d15fba4a-f47b-4143-ba07-6d368e19f33f" (UID: "d15fba4a-f47b-4143-ba07-6d368e19f33f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.629271 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d15fba4a-f47b-4143-ba07-6d368e19f33f" (UID: "d15fba4a-f47b-4143-ba07-6d368e19f33f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.700500 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.700535 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.700550 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.700561 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d15fba4a-f47b-4143-ba07-6d368e19f33f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.858191 4795 generic.go:334] "Generic (PLEG): container finished" podID="d15fba4a-f47b-4143-ba07-6d368e19f33f" containerID="1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e" exitCode=0 Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.858258 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" 
event={"ID":"d15fba4a-f47b-4143-ba07-6d368e19f33f","Type":"ContainerDied","Data":"1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e"} Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.858328 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.858354 4795 scope.go:117] "RemoveContainer" containerID="1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.858335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-tbd8b" event={"ID":"d15fba4a-f47b-4143-ba07-6d368e19f33f","Type":"ContainerDied","Data":"323195fb26c1f87a425aad14432b7b2af7fc3a49134602cae4b250e05d04550e"} Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.928386 4795 scope.go:117] "RemoveContainer" containerID="c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.940531 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-tbd8b"] Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.950342 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-tbd8b"] Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.960813 4795 scope.go:117] "RemoveContainer" containerID="1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e" Dec 05 08:49:27 crc kubenswrapper[4795]: E1205 08:49:27.961202 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e\": container with ID starting with 1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e not found: ID does not exist" containerID="1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e" Dec 05 08:49:27 crc 
kubenswrapper[4795]: I1205 08:49:27.961238 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e"} err="failed to get container status \"1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e\": rpc error: code = NotFound desc = could not find container \"1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e\": container with ID starting with 1682da133f99f2b90536bbf508d1d21fdb974bc9a6ef6e2f7a8b12b5bd2b942e not found: ID does not exist" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.961261 4795 scope.go:117] "RemoveContainer" containerID="c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235" Dec 05 08:49:27 crc kubenswrapper[4795]: E1205 08:49:27.961480 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235\": container with ID starting with c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235 not found: ID does not exist" containerID="c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235" Dec 05 08:49:27 crc kubenswrapper[4795]: I1205 08:49:27.961501 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235"} err="failed to get container status \"c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235\": rpc error: code = NotFound desc = could not find container \"c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235\": container with ID starting with c0bed83385b461d2da61506edd1c0a2166339276f416c1a9fc8c25b60c20a235 not found: ID does not exist" Dec 05 08:49:28 crc kubenswrapper[4795]: I1205 08:49:28.040514 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-zwsqp"] Dec 05 08:49:28 crc 
kubenswrapper[4795]: I1205 08:49:28.131789 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9f4pt"] Dec 05 08:49:28 crc kubenswrapper[4795]: I1205 08:49:28.771361 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15fba4a-f47b-4143-ba07-6d368e19f33f" path="/var/lib/kubelet/pods/d15fba4a-f47b-4143-ba07-6d368e19f33f/volumes" Dec 05 08:49:28 crc kubenswrapper[4795]: I1205 08:49:28.875440 4795 generic.go:334] "Generic (PLEG): container finished" podID="b895bd86-3c76-4653-8527-d1cfef368c37" containerID="cbbb00e44eb204fe9f0d437931dcc3d26f6e7a29f50ebb8e963daa437ed69708" exitCode=0 Dec 05 08:49:28 crc kubenswrapper[4795]: I1205 08:49:28.875875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" event={"ID":"b895bd86-3c76-4653-8527-d1cfef368c37","Type":"ContainerDied","Data":"cbbb00e44eb204fe9f0d437931dcc3d26f6e7a29f50ebb8e963daa437ed69708"} Dec 05 08:49:28 crc kubenswrapper[4795]: I1205 08:49:28.876021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" event={"ID":"b895bd86-3c76-4653-8527-d1cfef368c37","Type":"ContainerStarted","Data":"f14ddeea87d88f2112b2f67ecbf517e32f4f3757367ed6cf8b081bd859f04783"} Dec 05 08:49:28 crc kubenswrapper[4795]: I1205 08:49:28.881020 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9f4pt" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="registry-server" containerID="cri-o://03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911" gracePeriod=2 Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.604419 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.766408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwvfk\" (UniqueName: \"kubernetes.io/projected/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-kube-api-access-nwvfk\") pod \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.767912 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-catalog-content\") pod \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.767975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-utilities\") pod \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\" (UID: \"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc\") " Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.768887 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-utilities" (OuterVolumeSpecName: "utilities") pod "9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" (UID: "9e0b1da9-ee45-453b-82c7-da25ef5cd6bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.773356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-kube-api-access-nwvfk" (OuterVolumeSpecName: "kube-api-access-nwvfk") pod "9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" (UID: "9e0b1da9-ee45-453b-82c7-da25ef5cd6bc"). InnerVolumeSpecName "kube-api-access-nwvfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.869756 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwvfk\" (UniqueName: \"kubernetes.io/projected/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-kube-api-access-nwvfk\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.869786 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.891668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" event={"ID":"b895bd86-3c76-4653-8527-d1cfef368c37","Type":"ContainerStarted","Data":"68fc37bcc5a63747c0e86eeffd89dfa69a66dbb95712f794ea0b13471ae0d78d"} Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.891820 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.901375 4795 generic.go:334] "Generic (PLEG): container finished" podID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerID="03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911" exitCode=0 Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.901526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f4pt" event={"ID":"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc","Type":"ContainerDied","Data":"03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911"} Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.901654 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9f4pt" event={"ID":"9e0b1da9-ee45-453b-82c7-da25ef5cd6bc","Type":"ContainerDied","Data":"c4572bed67900beb90e9c75775ce3fd9b81845230fb35f6b5b662185269ceac4"} Dec 05 08:49:29 crc 
kubenswrapper[4795]: I1205 08:49:29.901728 4795 scope.go:117] "RemoveContainer" containerID="03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.901906 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9f4pt" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.922579 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" podStartSLOduration=2.92255492 podStartE2EDuration="2.92255492s" podCreationTimestamp="2025-12-05 08:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:49:29.913194317 +0000 UTC m=+1521.485798056" watchObservedRunningTime="2025-12-05 08:49:29.92255492 +0000 UTC m=+1521.495158679" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.931438 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" (UID: "9e0b1da9-ee45-453b-82c7-da25ef5cd6bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.969047 4795 scope.go:117] "RemoveContainer" containerID="a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a" Dec 05 08:49:29 crc kubenswrapper[4795]: I1205 08:49:29.971808 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.022414 4795 scope.go:117] "RemoveContainer" containerID="2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d" Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.057018 4795 scope.go:117] "RemoveContainer" containerID="03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911" Dec 05 08:49:30 crc kubenswrapper[4795]: E1205 08:49:30.057789 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911\": container with ID starting with 03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911 not found: ID does not exist" containerID="03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911" Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.057828 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911"} err="failed to get container status \"03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911\": rpc error: code = NotFound desc = could not find container \"03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911\": container with ID starting with 03be9667c62bb6d541e9250fe162f89878e35e50766c7ab602ebc58178d1b911 not found: ID does not exist" Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.057855 4795 
scope.go:117] "RemoveContainer" containerID="a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a" Dec 05 08:49:30 crc kubenswrapper[4795]: E1205 08:49:30.058486 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a\": container with ID starting with a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a not found: ID does not exist" containerID="a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a" Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.058520 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a"} err="failed to get container status \"a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a\": rpc error: code = NotFound desc = could not find container \"a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a\": container with ID starting with a6838de5cbd80579cac8b5fa2f543f447fd8cf6e25ec67855540061d40bd785a not found: ID does not exist" Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.058538 4795 scope.go:117] "RemoveContainer" containerID="2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d" Dec 05 08:49:30 crc kubenswrapper[4795]: E1205 08:49:30.058981 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d\": container with ID starting with 2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d not found: ID does not exist" containerID="2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d" Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.059018 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d"} err="failed to get container status \"2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d\": rpc error: code = NotFound desc = could not find container \"2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d\": container with ID starting with 2066bdad51a4240b24a1d475063a57dc71c6f36e33a5ed3d096859257acb816d not found: ID does not exist" Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.242436 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9f4pt"] Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.254060 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9f4pt"] Dec 05 08:49:30 crc kubenswrapper[4795]: I1205 08:49:30.759260 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" path="/var/lib/kubelet/pods/9e0b1da9-ee45-453b-82c7-da25ef5cd6bc/volumes" Dec 05 08:49:37 crc kubenswrapper[4795]: I1205 08:49:37.414844 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67cb876dc9-zwsqp" Dec 05 08:49:37 crc kubenswrapper[4795]: I1205 08:49:37.507276 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pzpp7"] Dec 05 08:49:37 crc kubenswrapper[4795]: I1205 08:49:37.507605 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" podUID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" containerName="dnsmasq-dns" containerID="cri-o://ce80d725ee6315e111101dafb5d32da3aba4e9418ef8dafa2249abbbb6fcc646" gracePeriod=10 Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.010840 4795 generic.go:334] "Generic (PLEG): container finished" podID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" 
containerID="ce80d725ee6315e111101dafb5d32da3aba4e9418ef8dafa2249abbbb6fcc646" exitCode=0 Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.011032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" event={"ID":"caa232a3-a2a5-42ef-ad79-db32d33c2f75","Type":"ContainerDied","Data":"ce80d725ee6315e111101dafb5d32da3aba4e9418ef8dafa2249abbbb6fcc646"} Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.011394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" event={"ID":"caa232a3-a2a5-42ef-ad79-db32d33c2f75","Type":"ContainerDied","Data":"e2ee7f0aeb746e9f98ee68e41ffd7e523b96b3e594e974b130a0572e7a11ff3a"} Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.011450 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2ee7f0aeb746e9f98ee68e41ffd7e523b96b3e594e974b130a0572e7a11ff3a" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.094390 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.261411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-swift-storage-0\") pod \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.261471 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-sb\") pod \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.261543 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-config\") pod \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.261597 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrr28\" (UniqueName: \"kubernetes.io/projected/caa232a3-a2a5-42ef-ad79-db32d33c2f75-kube-api-access-jrr28\") pod \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.261636 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-svc\") pod \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.262027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-nb\") pod \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.262124 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-openstack-edpm-ipam\") pod \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\" (UID: \"caa232a3-a2a5-42ef-ad79-db32d33c2f75\") " Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.270832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa232a3-a2a5-42ef-ad79-db32d33c2f75-kube-api-access-jrr28" (OuterVolumeSpecName: "kube-api-access-jrr28") pod "caa232a3-a2a5-42ef-ad79-db32d33c2f75" (UID: "caa232a3-a2a5-42ef-ad79-db32d33c2f75"). InnerVolumeSpecName "kube-api-access-jrr28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.331211 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "caa232a3-a2a5-42ef-ad79-db32d33c2f75" (UID: "caa232a3-a2a5-42ef-ad79-db32d33c2f75"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.339406 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "caa232a3-a2a5-42ef-ad79-db32d33c2f75" (UID: "caa232a3-a2a5-42ef-ad79-db32d33c2f75"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.342231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "caa232a3-a2a5-42ef-ad79-db32d33c2f75" (UID: "caa232a3-a2a5-42ef-ad79-db32d33c2f75"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.342739 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-config" (OuterVolumeSpecName: "config") pod "caa232a3-a2a5-42ef-ad79-db32d33c2f75" (UID: "caa232a3-a2a5-42ef-ad79-db32d33c2f75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.343815 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "caa232a3-a2a5-42ef-ad79-db32d33c2f75" (UID: "caa232a3-a2a5-42ef-ad79-db32d33c2f75"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.350738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "caa232a3-a2a5-42ef-ad79-db32d33c2f75" (UID: "caa232a3-a2a5-42ef-ad79-db32d33c2f75"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.365108 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.365157 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.365170 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.365184 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-config\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.365197 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrr28\" (UniqueName: \"kubernetes.io/projected/caa232a3-a2a5-42ef-ad79-db32d33c2f75-kube-api-access-jrr28\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.365213 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:38 crc kubenswrapper[4795]: I1205 08:49:38.365225 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/caa232a3-a2a5-42ef-ad79-db32d33c2f75-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 05 08:49:39 crc kubenswrapper[4795]: I1205 08:49:39.069363 
4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-pzpp7" Dec 05 08:49:39 crc kubenswrapper[4795]: I1205 08:49:39.124929 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pzpp7"] Dec 05 08:49:39 crc kubenswrapper[4795]: I1205 08:49:39.144159 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-pzpp7"] Dec 05 08:49:40 crc kubenswrapper[4795]: I1205 08:49:40.761919 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" path="/var/lib/kubelet/pods/caa232a3-a2a5-42ef-ad79-db32d33c2f75/volumes" Dec 05 08:49:49 crc kubenswrapper[4795]: I1205 08:49:49.764687 4795 scope.go:117] "RemoveContainer" containerID="68bffd95884faa6a9f67416889f9419e2cf349d11a221f0567998d18b4afefc9" Dec 05 08:49:49 crc kubenswrapper[4795]: I1205 08:49:49.797659 4795 scope.go:117] "RemoveContainer" containerID="68acd4634fef3c7693199a79755e626884bbb5a02aea88b58dffb628e4c5a292" Dec 05 08:49:49 crc kubenswrapper[4795]: I1205 08:49:49.829074 4795 scope.go:117] "RemoveContainer" containerID="1b4c7de9fcb061cea40e9db8c57dfb855602c2e628cd66646181cfb67fb201e1" Dec 05 08:49:49 crc kubenswrapper[4795]: I1205 08:49:49.910855 4795 scope.go:117] "RemoveContainer" containerID="55846ea4e6e7ef73251e4dd94a954dede2cbec9808a4b9564d739573c9c9be79" Dec 05 08:49:53 crc kubenswrapper[4795]: I1205 08:49:53.251874 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ad9b797-2884-4af6-8a64-8f82b3523d3e" containerID="87ffca145b007ea985987f2f625e98b2d0704468c3d7ae342b904a086773c5a1" exitCode=0 Dec 05 08:49:53 crc kubenswrapper[4795]: I1205 08:49:53.252083 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ad9b797-2884-4af6-8a64-8f82b3523d3e","Type":"ContainerDied","Data":"87ffca145b007ea985987f2f625e98b2d0704468c3d7ae342b904a086773c5a1"} Dec 05 08:49:54 crc 
kubenswrapper[4795]: I1205 08:49:54.267711 4795 generic.go:334] "Generic (PLEG): container finished" podID="5139f934-4821-4038-9401-c22f469bf070" containerID="1efca41859888c797476710313ae3d2a10139f8edc7dc5cd9a70a23945161d13" exitCode=0 Dec 05 08:49:54 crc kubenswrapper[4795]: I1205 08:49:54.267806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5139f934-4821-4038-9401-c22f469bf070","Type":"ContainerDied","Data":"1efca41859888c797476710313ae3d2a10139f8edc7dc5cd9a70a23945161d13"} Dec 05 08:49:54 crc kubenswrapper[4795]: I1205 08:49:54.273405 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7ad9b797-2884-4af6-8a64-8f82b3523d3e","Type":"ContainerStarted","Data":"d8392b582126656f046ea6b9f352f1730b3425768b87638a2473479017256aff"} Dec 05 08:49:54 crc kubenswrapper[4795]: I1205 08:49:54.274969 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 05 08:49:54 crc kubenswrapper[4795]: I1205 08:49:54.386547 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=41.386518823 podStartE2EDuration="41.386518823s" podCreationTimestamp="2025-12-05 08:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:49:54.378929927 +0000 UTC m=+1545.951533676" watchObservedRunningTime="2025-12-05 08:49:54.386518823 +0000 UTC m=+1545.959122562" Dec 05 08:49:55 crc kubenswrapper[4795]: I1205 08:49:55.285281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5139f934-4821-4038-9401-c22f469bf070","Type":"ContainerStarted","Data":"f44af2c6d109ba5e0aa9505b668cabd5ea96d22b129d986c5189fdafc0d1be89"} Dec 05 08:49:55 crc kubenswrapper[4795]: I1205 08:49:55.286605 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:49:55 crc kubenswrapper[4795]: I1205 08:49:55.436799 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.436770588 podStartE2EDuration="37.436770588s" podCreationTimestamp="2025-12-05 08:49:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 08:49:55.433338745 +0000 UTC m=+1547.005942484" watchObservedRunningTime="2025-12-05 08:49:55.436770588 +0000 UTC m=+1547.009374327" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.860667 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd"] Dec 05 08:50:00 crc kubenswrapper[4795]: E1205 08:50:00.862037 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="extract-utilities" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862058 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="extract-utilities" Dec 05 08:50:00 crc kubenswrapper[4795]: E1205 08:50:00.862085 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15fba4a-f47b-4143-ba07-6d368e19f33f" containerName="dnsmasq-dns" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862093 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15fba4a-f47b-4143-ba07-6d368e19f33f" containerName="dnsmasq-dns" Dec 05 08:50:00 crc kubenswrapper[4795]: E1205 08:50:00.862109 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" containerName="init" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862119 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" containerName="init" Dec 05 
08:50:00 crc kubenswrapper[4795]: E1205 08:50:00.862130 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="registry-server" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862138 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="registry-server" Dec 05 08:50:00 crc kubenswrapper[4795]: E1205 08:50:00.862157 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" containerName="dnsmasq-dns" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862164 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" containerName="dnsmasq-dns" Dec 05 08:50:00 crc kubenswrapper[4795]: E1205 08:50:00.862176 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15fba4a-f47b-4143-ba07-6d368e19f33f" containerName="init" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862182 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15fba4a-f47b-4143-ba07-6d368e19f33f" containerName="init" Dec 05 08:50:00 crc kubenswrapper[4795]: E1205 08:50:00.862198 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="extract-content" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862204 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="extract-content" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862419 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e0b1da9-ee45-453b-82c7-da25ef5cd6bc" containerName="registry-server" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.862429 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15fba4a-f47b-4143-ba07-6d368e19f33f" containerName="dnsmasq-dns" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 
08:50:00.862450 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa232a3-a2a5-42ef-ad79-db32d33c2f75" containerName="dnsmasq-dns" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.863237 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.866138 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.866530 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.866804 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.867180 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.886467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.886509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5rwr\" (UniqueName: \"kubernetes.io/projected/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-kube-api-access-h5rwr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 
08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.886565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.886652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.904201 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd"] Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.988263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.988626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5rwr\" (UniqueName: \"kubernetes.io/projected/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-kube-api-access-h5rwr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc 
kubenswrapper[4795]: I1205 08:50:00.988740 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.988904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.996907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:00 crc kubenswrapper[4795]: I1205 08:50:00.998389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:01 crc kubenswrapper[4795]: I1205 08:50:01.013571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5rwr\" (UniqueName: \"kubernetes.io/projected/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-kube-api-access-h5rwr\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:01 crc kubenswrapper[4795]: I1205 08:50:01.013817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:01 crc kubenswrapper[4795]: I1205 08:50:01.185831 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:01 crc kubenswrapper[4795]: I1205 08:50:01.965943 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd"] Dec 05 08:50:02 crc kubenswrapper[4795]: I1205 08:50:02.372426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" event={"ID":"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc","Type":"ContainerStarted","Data":"b9148532c7abfa3be3abf42385edaa217d891a74750827de6a0aef60bc046ae2"} Dec 05 08:50:04 crc kubenswrapper[4795]: I1205 08:50:04.093447 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="7ad9b797-2884-4af6-8a64-8f82b3523d3e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.205:5671: connect: connection refused" Dec 05 08:50:09 crc kubenswrapper[4795]: I1205 08:50:09.273382 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 05 08:50:14 crc kubenswrapper[4795]: I1205 08:50:14.089974 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.617920 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-57bs5"] Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.622090 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.634091 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-57bs5"] Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.776142 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jkm\" (UniqueName: \"kubernetes.io/projected/820af2df-905b-43b4-b1ff-a1af84cc33e7-kube-api-access-w7jkm\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.776346 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-catalog-content\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.777358 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-utilities\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.880008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-utilities\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.880668 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-utilities\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.880110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jkm\" (UniqueName: \"kubernetes.io/projected/820af2df-905b-43b4-b1ff-a1af84cc33e7-kube-api-access-w7jkm\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.880767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-catalog-content\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.881076 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-catalog-content\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.922730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jkm\" (UniqueName: 
\"kubernetes.io/projected/820af2df-905b-43b4-b1ff-a1af84cc33e7-kube-api-access-w7jkm\") pod \"certified-operators-57bs5\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:36 crc kubenswrapper[4795]: I1205 08:50:36.957377 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:38 crc kubenswrapper[4795]: I1205 08:50:38.491932 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:50:38 crc kubenswrapper[4795]: I1205 08:50:38.885780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" event={"ID":"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc","Type":"ContainerStarted","Data":"84c82ec65ca1b9971bb341d8a22a0605a80b7d6b049669f7bcb8ec4052b233a2"} Dec 05 08:50:39 crc kubenswrapper[4795]: I1205 08:50:39.110651 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" podStartSLOduration=2.599391681 podStartE2EDuration="39.110629845s" podCreationTimestamp="2025-12-05 08:50:00 +0000 UTC" firstStartedPulling="2025-12-05 08:50:01.976407133 +0000 UTC m=+1553.549010872" lastFinishedPulling="2025-12-05 08:50:38.487645297 +0000 UTC m=+1590.060249036" observedRunningTime="2025-12-05 08:50:38.925444947 +0000 UTC m=+1590.498048686" watchObservedRunningTime="2025-12-05 08:50:39.110629845 +0000 UTC m=+1590.683233584" Dec 05 08:50:39 crc kubenswrapper[4795]: I1205 08:50:39.115605 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-57bs5"] Dec 05 08:50:39 crc kubenswrapper[4795]: W1205 08:50:39.116385 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod820af2df_905b_43b4_b1ff_a1af84cc33e7.slice/crio-9d85af0a58631ca37f355b5d9e208e5fe820490d14243b455a3f45b6863cb67f WatchSource:0}: Error finding container 9d85af0a58631ca37f355b5d9e208e5fe820490d14243b455a3f45b6863cb67f: Status 404 returned error can't find the container with id 9d85af0a58631ca37f355b5d9e208e5fe820490d14243b455a3f45b6863cb67f Dec 05 08:50:39 crc kubenswrapper[4795]: I1205 08:50:39.905440 4795 generic.go:334] "Generic (PLEG): container finished" podID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerID="5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25" exitCode=0 Dec 05 08:50:39 crc kubenswrapper[4795]: I1205 08:50:39.906140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57bs5" event={"ID":"820af2df-905b-43b4-b1ff-a1af84cc33e7","Type":"ContainerDied","Data":"5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25"} Dec 05 08:50:39 crc kubenswrapper[4795]: I1205 08:50:39.906252 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57bs5" event={"ID":"820af2df-905b-43b4-b1ff-a1af84cc33e7","Type":"ContainerStarted","Data":"9d85af0a58631ca37f355b5d9e208e5fe820490d14243b455a3f45b6863cb67f"} Dec 05 08:50:39 crc kubenswrapper[4795]: I1205 08:50:39.915521 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 08:50:40 crc kubenswrapper[4795]: I1205 08:50:40.922394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57bs5" event={"ID":"820af2df-905b-43b4-b1ff-a1af84cc33e7","Type":"ContainerStarted","Data":"3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac"} Dec 05 08:50:42 crc kubenswrapper[4795]: I1205 08:50:42.943144 4795 generic.go:334] "Generic (PLEG): container finished" podID="820af2df-905b-43b4-b1ff-a1af84cc33e7" 
containerID="3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac" exitCode=0 Dec 05 08:50:42 crc kubenswrapper[4795]: I1205 08:50:42.943262 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57bs5" event={"ID":"820af2df-905b-43b4-b1ff-a1af84cc33e7","Type":"ContainerDied","Data":"3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac"} Dec 05 08:50:43 crc kubenswrapper[4795]: I1205 08:50:43.957605 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57bs5" event={"ID":"820af2df-905b-43b4-b1ff-a1af84cc33e7","Type":"ContainerStarted","Data":"cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f"} Dec 05 08:50:43 crc kubenswrapper[4795]: I1205 08:50:43.988139 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-57bs5" podStartSLOduration=4.531397687 podStartE2EDuration="7.988118576s" podCreationTimestamp="2025-12-05 08:50:36 +0000 UTC" firstStartedPulling="2025-12-05 08:50:39.91507658 +0000 UTC m=+1591.487680359" lastFinishedPulling="2025-12-05 08:50:43.371797469 +0000 UTC m=+1594.944401248" observedRunningTime="2025-12-05 08:50:43.978128286 +0000 UTC m=+1595.550732025" watchObservedRunningTime="2025-12-05 08:50:43.988118576 +0000 UTC m=+1595.560722315" Dec 05 08:50:46 crc kubenswrapper[4795]: I1205 08:50:46.957814 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:46 crc kubenswrapper[4795]: I1205 08:50:46.958572 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:48 crc kubenswrapper[4795]: I1205 08:50:48.015951 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-57bs5" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="registry-server" 
probeResult="failure" output=< Dec 05 08:50:48 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 08:50:48 crc kubenswrapper[4795]: > Dec 05 08:50:50 crc kubenswrapper[4795]: I1205 08:50:50.153045 4795 scope.go:117] "RemoveContainer" containerID="a19b2ed86ada4f3fd6d8dc18a552df4ab151f3651f6e0c08cf9d9885ead1f563" Dec 05 08:50:51 crc kubenswrapper[4795]: I1205 08:50:51.054257 4795 generic.go:334] "Generic (PLEG): container finished" podID="62af26c9-a1a2-43e6-9f1e-0ea9e48042bc" containerID="84c82ec65ca1b9971bb341d8a22a0605a80b7d6b049669f7bcb8ec4052b233a2" exitCode=0 Dec 05 08:50:51 crc kubenswrapper[4795]: I1205 08:50:51.054868 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" event={"ID":"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc","Type":"ContainerDied","Data":"84c82ec65ca1b9971bb341d8a22a0605a80b7d6b049669f7bcb8ec4052b233a2"} Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.588335 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.712389 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-inventory\") pod \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.712939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5rwr\" (UniqueName: \"kubernetes.io/projected/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-kube-api-access-h5rwr\") pod \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.713235 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-repo-setup-combined-ca-bundle\") pod \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.713485 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-ssh-key\") pod \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\" (UID: \"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc\") " Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.723262 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "62af26c9-a1a2-43e6-9f1e-0ea9e48042bc" (UID: "62af26c9-a1a2-43e6-9f1e-0ea9e48042bc"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.731251 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-kube-api-access-h5rwr" (OuterVolumeSpecName: "kube-api-access-h5rwr") pod "62af26c9-a1a2-43e6-9f1e-0ea9e48042bc" (UID: "62af26c9-a1a2-43e6-9f1e-0ea9e48042bc"). InnerVolumeSpecName "kube-api-access-h5rwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.760467 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-inventory" (OuterVolumeSpecName: "inventory") pod "62af26c9-a1a2-43e6-9f1e-0ea9e48042bc" (UID: "62af26c9-a1a2-43e6-9f1e-0ea9e48042bc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.775974 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62af26c9-a1a2-43e6-9f1e-0ea9e48042bc" (UID: "62af26c9-a1a2-43e6-9f1e-0ea9e48042bc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.816311 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5rwr\" (UniqueName: \"kubernetes.io/projected/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-kube-api-access-h5rwr\") on node \"crc\" DevicePath \"\"" Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.816362 4795 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.816375 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:50:52 crc kubenswrapper[4795]: I1205 08:50:52.816386 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62af26c9-a1a2-43e6-9f1e-0ea9e48042bc-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.086670 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" event={"ID":"62af26c9-a1a2-43e6-9f1e-0ea9e48042bc","Type":"ContainerDied","Data":"b9148532c7abfa3be3abf42385edaa217d891a74750827de6a0aef60bc046ae2"} Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.086724 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9148532c7abfa3be3abf42385edaa217d891a74750827de6a0aef60bc046ae2" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.086832 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd" Dec 05 08:50:53 crc kubenswrapper[4795]: E1205 08:50:53.153034 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62af26c9_a1a2_43e6_9f1e_0ea9e48042bc.slice/crio-b9148532c7abfa3be3abf42385edaa217d891a74750827de6a0aef60bc046ae2\": RecentStats: unable to find data in memory cache]" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.218592 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4"] Dec 05 08:50:53 crc kubenswrapper[4795]: E1205 08:50:53.219052 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62af26c9-a1a2-43e6-9f1e-0ea9e48042bc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.219072 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="62af26c9-a1a2-43e6-9f1e-0ea9e48042bc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.219288 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="62af26c9-a1a2-43e6-9f1e-0ea9e48042bc" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.220025 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.227172 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.229000 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.229146 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.229152 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.235388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4"] Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.332564 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv495\" (UniqueName: \"kubernetes.io/projected/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-kube-api-access-gv495\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.332701 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.332749 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.434731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv495\" (UniqueName: \"kubernetes.io/projected/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-kube-api-access-gv495\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.434818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.434854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.441702 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.442956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.457044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv495\" (UniqueName: \"kubernetes.io/projected/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-kube-api-access-gv495\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-pvmj4\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:53 crc kubenswrapper[4795]: I1205 08:50:53.546037 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:54 crc kubenswrapper[4795]: I1205 08:50:54.155845 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4"] Dec 05 08:50:55 crc kubenswrapper[4795]: I1205 08:50:55.113590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" event={"ID":"e83a9af4-07e8-4f0e-a764-19e3f093fb2a","Type":"ContainerStarted","Data":"5fa1185bebac8c3b4a0dcd7a85d037124f11d1831d0896b483882814a79eef5a"} Dec 05 08:50:55 crc kubenswrapper[4795]: I1205 08:50:55.114441 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" event={"ID":"e83a9af4-07e8-4f0e-a764-19e3f093fb2a","Type":"ContainerStarted","Data":"e6b1ecaf809c87b9650c74862f24d94b8382c2c70f7ea98e1e0173f95760079e"} Dec 05 08:50:55 crc kubenswrapper[4795]: I1205 08:50:55.134388 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" podStartSLOduration=1.954440797 podStartE2EDuration="2.134364672s" podCreationTimestamp="2025-12-05 08:50:53 +0000 UTC" firstStartedPulling="2025-12-05 08:50:54.157596258 +0000 UTC m=+1605.730200007" lastFinishedPulling="2025-12-05 08:50:54.337520143 +0000 UTC m=+1605.910123882" observedRunningTime="2025-12-05 08:50:55.131763372 +0000 UTC m=+1606.704367121" watchObservedRunningTime="2025-12-05 08:50:55.134364672 +0000 UTC m=+1606.706968421" Dec 05 08:50:57 crc kubenswrapper[4795]: I1205 08:50:57.016352 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:57 crc kubenswrapper[4795]: I1205 08:50:57.082880 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:57 crc 
kubenswrapper[4795]: I1205 08:50:57.272112 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-57bs5"] Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.154638 4795 generic.go:334] "Generic (PLEG): container finished" podID="e83a9af4-07e8-4f0e-a764-19e3f093fb2a" containerID="5fa1185bebac8c3b4a0dcd7a85d037124f11d1831d0896b483882814a79eef5a" exitCode=0 Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.155344 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-57bs5" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="registry-server" containerID="cri-o://cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f" gracePeriod=2 Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.154764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" event={"ID":"e83a9af4-07e8-4f0e-a764-19e3f093fb2a","Type":"ContainerDied","Data":"5fa1185bebac8c3b4a0dcd7a85d037124f11d1831d0896b483882814a79eef5a"} Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.714959 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.872270 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7jkm\" (UniqueName: \"kubernetes.io/projected/820af2df-905b-43b4-b1ff-a1af84cc33e7-kube-api-access-w7jkm\") pod \"820af2df-905b-43b4-b1ff-a1af84cc33e7\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.872493 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-utilities\") pod \"820af2df-905b-43b4-b1ff-a1af84cc33e7\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.872629 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-catalog-content\") pod \"820af2df-905b-43b4-b1ff-a1af84cc33e7\" (UID: \"820af2df-905b-43b4-b1ff-a1af84cc33e7\") " Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.873676 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-utilities" (OuterVolumeSpecName: "utilities") pod "820af2df-905b-43b4-b1ff-a1af84cc33e7" (UID: "820af2df-905b-43b4-b1ff-a1af84cc33e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.886977 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820af2df-905b-43b4-b1ff-a1af84cc33e7-kube-api-access-w7jkm" (OuterVolumeSpecName: "kube-api-access-w7jkm") pod "820af2df-905b-43b4-b1ff-a1af84cc33e7" (UID: "820af2df-905b-43b4-b1ff-a1af84cc33e7"). InnerVolumeSpecName "kube-api-access-w7jkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.925478 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "820af2df-905b-43b4-b1ff-a1af84cc33e7" (UID: "820af2df-905b-43b4-b1ff-a1af84cc33e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.975798 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.975836 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820af2df-905b-43b4-b1ff-a1af84cc33e7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:50:58 crc kubenswrapper[4795]: I1205 08:50:58.975869 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7jkm\" (UniqueName: \"kubernetes.io/projected/820af2df-905b-43b4-b1ff-a1af84cc33e7-kube-api-access-w7jkm\") on node \"crc\" DevicePath \"\"" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.174024 4795 generic.go:334] "Generic (PLEG): container finished" podID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerID="cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f" exitCode=0 Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.174092 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57bs5" event={"ID":"820af2df-905b-43b4-b1ff-a1af84cc33e7","Type":"ContainerDied","Data":"cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f"} Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.174200 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-57bs5" event={"ID":"820af2df-905b-43b4-b1ff-a1af84cc33e7","Type":"ContainerDied","Data":"9d85af0a58631ca37f355b5d9e208e5fe820490d14243b455a3f45b6863cb67f"} Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.174237 4795 scope.go:117] "RemoveContainer" containerID="cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.175772 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57bs5" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.227815 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-57bs5"] Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.238043 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-57bs5"] Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.247015 4795 scope.go:117] "RemoveContainer" containerID="3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.297103 4795 scope.go:117] "RemoveContainer" containerID="5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.347906 4795 scope.go:117] "RemoveContainer" containerID="cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f" Dec 05 08:50:59 crc kubenswrapper[4795]: E1205 08:50:59.348589 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f\": container with ID starting with cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f not found: ID does not exist" containerID="cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 
08:50:59.348663 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f"} err="failed to get container status \"cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f\": rpc error: code = NotFound desc = could not find container \"cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f\": container with ID starting with cfd0a9c4e1bcb9018780210a649618df308e24801130c3c4250e1472724b968f not found: ID does not exist" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.348739 4795 scope.go:117] "RemoveContainer" containerID="3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac" Dec 05 08:50:59 crc kubenswrapper[4795]: E1205 08:50:59.349231 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac\": container with ID starting with 3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac not found: ID does not exist" containerID="3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.349275 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac"} err="failed to get container status \"3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac\": rpc error: code = NotFound desc = could not find container \"3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac\": container with ID starting with 3ce69e2444fa784ee5e205955bf0988b3dd975a79cb18f7cfc12b021728edbac not found: ID does not exist" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.349318 4795 scope.go:117] "RemoveContainer" containerID="5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25" Dec 05 08:50:59 crc 
kubenswrapper[4795]: E1205 08:50:59.349999 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25\": container with ID starting with 5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25 not found: ID does not exist" containerID="5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.350061 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25"} err="failed to get container status \"5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25\": rpc error: code = NotFound desc = could not find container \"5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25\": container with ID starting with 5433e1b586773b4aa81fb8c897cb0a0976dca3ced4b2a3ad1bbf7a96bf1b8e25 not found: ID does not exist" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.721022 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.899458 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-inventory\") pod \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.899743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv495\" (UniqueName: \"kubernetes.io/projected/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-kube-api-access-gv495\") pod \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.899997 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-ssh-key\") pod \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\" (UID: \"e83a9af4-07e8-4f0e-a764-19e3f093fb2a\") " Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.906982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-kube-api-access-gv495" (OuterVolumeSpecName: "kube-api-access-gv495") pod "e83a9af4-07e8-4f0e-a764-19e3f093fb2a" (UID: "e83a9af4-07e8-4f0e-a764-19e3f093fb2a"). InnerVolumeSpecName "kube-api-access-gv495". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.935587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-inventory" (OuterVolumeSpecName: "inventory") pod "e83a9af4-07e8-4f0e-a764-19e3f093fb2a" (UID: "e83a9af4-07e8-4f0e-a764-19e3f093fb2a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:50:59 crc kubenswrapper[4795]: I1205 08:50:59.936791 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e83a9af4-07e8-4f0e-a764-19e3f093fb2a" (UID: "e83a9af4-07e8-4f0e-a764-19e3f093fb2a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.002942 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.002997 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.003013 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv495\" (UniqueName: \"kubernetes.io/projected/e83a9af4-07e8-4f0e-a764-19e3f093fb2a-kube-api-access-gv495\") on node \"crc\" DevicePath \"\"" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.187570 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" event={"ID":"e83a9af4-07e8-4f0e-a764-19e3f093fb2a","Type":"ContainerDied","Data":"e6b1ecaf809c87b9650c74862f24d94b8382c2c70f7ea98e1e0173f95760079e"} Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.187683 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6b1ecaf809c87b9650c74862f24d94b8382c2c70f7ea98e1e0173f95760079e" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.187835 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-pvmj4" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.349791 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd"] Dec 05 08:51:00 crc kubenswrapper[4795]: E1205 08:51:00.350818 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="extract-utilities" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.350907 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="extract-utilities" Dec 05 08:51:00 crc kubenswrapper[4795]: E1205 08:51:00.350973 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="extract-content" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.351023 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="extract-content" Dec 05 08:51:00 crc kubenswrapper[4795]: E1205 08:51:00.351139 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="registry-server" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.351196 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="registry-server" Dec 05 08:51:00 crc kubenswrapper[4795]: E1205 08:51:00.351267 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83a9af4-07e8-4f0e-a764-19e3f093fb2a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.351324 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83a9af4-07e8-4f0e-a764-19e3f093fb2a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.351581 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e83a9af4-07e8-4f0e-a764-19e3f093fb2a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.351701 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" containerName="registry-server" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.352535 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.357540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.357831 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.358032 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.358178 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.364703 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd"] Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.518836 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.518967 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.519652 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knfkd\" (UniqueName: \"kubernetes.io/projected/5e42d8c8-afcc-4c91-a967-5aac94f29019-kube-api-access-knfkd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.519746 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.621372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.621446 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.621552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knfkd\" (UniqueName: \"kubernetes.io/projected/5e42d8c8-afcc-4c91-a967-5aac94f29019-kube-api-access-knfkd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.621578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.626879 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.626977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.627533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.653544 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knfkd\" (UniqueName: \"kubernetes.io/projected/5e42d8c8-afcc-4c91-a967-5aac94f29019-kube-api-access-knfkd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.696786 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:51:00 crc kubenswrapper[4795]: I1205 08:51:00.778347 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820af2df-905b-43b4-b1ff-a1af84cc33e7" path="/var/lib/kubelet/pods/820af2df-905b-43b4-b1ff-a1af84cc33e7/volumes" Dec 05 08:51:01 crc kubenswrapper[4795]: I1205 08:51:01.080514 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd"] Dec 05 08:51:01 crc kubenswrapper[4795]: I1205 08:51:01.206241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" event={"ID":"5e42d8c8-afcc-4c91-a967-5aac94f29019","Type":"ContainerStarted","Data":"fdbfc8bed1397f7d24894f6cdf696800922a888adde486546929b4255697c562"} Dec 05 08:51:02 crc kubenswrapper[4795]: I1205 08:51:02.222811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" event={"ID":"5e42d8c8-afcc-4c91-a967-5aac94f29019","Type":"ContainerStarted","Data":"11f3bf873250c12c5e611a55bc0bdeb87049c9cc5fe656afb4988e7c734c132a"} Dec 05 08:51:02 crc kubenswrapper[4795]: I1205 08:51:02.256787 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" podStartSLOduration=2.0502880980000002 podStartE2EDuration="2.256758732s" podCreationTimestamp="2025-12-05 08:51:00 +0000 UTC" firstStartedPulling="2025-12-05 08:51:01.081791228 +0000 UTC m=+1612.654394967" lastFinishedPulling="2025-12-05 08:51:01.288261862 +0000 UTC m=+1612.860865601" observedRunningTime="2025-12-05 08:51:02.246271388 +0000 UTC m=+1613.818875157" watchObservedRunningTime="2025-12-05 08:51:02.256758732 +0000 UTC m=+1613.829362511" Dec 05 08:51:10 crc kubenswrapper[4795]: I1205 08:51:10.827457 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:51:10 crc kubenswrapper[4795]: I1205 08:51:10.827982 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:51:40 crc kubenswrapper[4795]: I1205 08:51:40.827185 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:51:40 crc kubenswrapper[4795]: I1205 08:51:40.828378 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:52:10 crc kubenswrapper[4795]: I1205 08:52:10.827498 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:52:10 crc kubenswrapper[4795]: I1205 08:52:10.828290 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 08:52:10 crc kubenswrapper[4795]: I1205 08:52:10.828352 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 08:52:10 crc kubenswrapper[4795]: I1205 08:52:10.829222 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 08:52:10 crc kubenswrapper[4795]: I1205 08:52:10.829285 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" gracePeriod=600 Dec 05 08:52:10 crc kubenswrapper[4795]: E1205 08:52:10.957072 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:52:11 crc kubenswrapper[4795]: I1205 08:52:11.069831 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" exitCode=0 Dec 05 08:52:11 crc kubenswrapper[4795]: I1205 08:52:11.069904 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca"} Dec 05 08:52:11 crc kubenswrapper[4795]: I1205 08:52:11.069974 4795 scope.go:117] "RemoveContainer" containerID="d86e89d94962757844e50b0a42fc344e8a17a880839200160f5350ade5d60002" Dec 05 08:52:11 crc kubenswrapper[4795]: I1205 08:52:11.072357 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:52:11 crc kubenswrapper[4795]: E1205 08:52:11.072869 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:52:24 crc kubenswrapper[4795]: I1205 08:52:24.748626 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:52:24 crc kubenswrapper[4795]: E1205 08:52:24.749551 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:52:38 crc kubenswrapper[4795]: I1205 08:52:38.757337 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:52:38 crc kubenswrapper[4795]: E1205 
08:52:38.758250 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.076355 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b449-account-create-update-8qjgb"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.097131 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-khgjk"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.112120 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-6qbk2"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.121680 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5042-account-create-update-zvtfw"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.131781 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-71b3-account-create-update-hwtdv"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.139972 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-9lvnd"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.149630 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-71b3-account-create-update-hwtdv"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.158643 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-khgjk"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.166871 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-9lvnd"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 
08:52:52.177177 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5042-account-create-update-zvtfw"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.186248 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b449-account-create-update-8qjgb"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.197121 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-6qbk2"] Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.780192 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23b22dd0-9f53-4a9f-a431-3de3d43d1e14" path="/var/lib/kubelet/pods/23b22dd0-9f53-4a9f-a431-3de3d43d1e14/volumes" Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.782904 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b0b091-d9e3-4203-a73f-bd38fe4105f8" path="/var/lib/kubelet/pods/32b0b091-d9e3-4203-a73f-bd38fe4105f8/volumes" Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.784775 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b485181e-1817-4475-82e6-fa5705a822c1" path="/var/lib/kubelet/pods/b485181e-1817-4475-82e6-fa5705a822c1/volumes" Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.786045 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd94cdc2-bb5f-423f-b2d6-5e83c050cd07" path="/var/lib/kubelet/pods/bd94cdc2-bb5f-423f-b2d6-5e83c050cd07/volumes" Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.787192 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3d3a00-5afb-42a5-921c-e7e77afe8a7b" path="/var/lib/kubelet/pods/bf3d3a00-5afb-42a5-921c-e7e77afe8a7b/volumes" Dec 05 08:52:52 crc kubenswrapper[4795]: I1205 08:52:52.789680 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71c1756-d33b-48eb-b3e5-9298a6ba19e0" path="/var/lib/kubelet/pods/c71c1756-d33b-48eb-b3e5-9298a6ba19e0/volumes" Dec 05 08:52:53 crc 
kubenswrapper[4795]: I1205 08:52:53.748332 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:52:53 crc kubenswrapper[4795]: E1205 08:52:53.749298 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:53:05 crc kubenswrapper[4795]: I1205 08:53:05.747917 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:53:05 crc kubenswrapper[4795]: E1205 08:53:05.750424 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:53:17 crc kubenswrapper[4795]: I1205 08:53:17.747401 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:53:17 crc kubenswrapper[4795]: E1205 08:53:17.748863 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 
05 08:53:28 crc kubenswrapper[4795]: I1205 08:53:28.064789 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wklbl"] Dec 05 08:53:28 crc kubenswrapper[4795]: I1205 08:53:28.074222 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wklbl"] Dec 05 08:53:28 crc kubenswrapper[4795]: I1205 08:53:28.767779 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc84840-1b4e-4838-8083-61c785bec8a2" path="/var/lib/kubelet/pods/9fc84840-1b4e-4838-8083-61c785bec8a2/volumes" Dec 05 08:53:31 crc kubenswrapper[4795]: I1205 08:53:31.751108 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:53:31 crc kubenswrapper[4795]: E1205 08:53:31.751969 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.050955 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a5de-account-create-update-wzl2s"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.060527 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-2mxzn"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.073004 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7bt8k"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.085411 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b3d5-account-create-update-cmx25"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.099119 4795 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/barbican-db-create-7zmc2"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.110355 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7bt8k"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.119183 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a5de-account-create-update-wzl2s"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.127127 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b3d5-account-create-update-cmx25"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.134678 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-2mxzn"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.141707 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9f8c-account-create-update-rc7kk"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.148712 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-7zmc2"] Dec 05 08:53:41 crc kubenswrapper[4795]: I1205 08:53:41.155790 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9f8c-account-create-update-rc7kk"] Dec 05 08:53:42 crc kubenswrapper[4795]: I1205 08:53:42.748016 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:53:42 crc kubenswrapper[4795]: E1205 08:53:42.748788 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:53:42 crc kubenswrapper[4795]: I1205 08:53:42.769021 4795 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e2b20c8-28b6-4f45-825b-0452d27fa54a" path="/var/lib/kubelet/pods/1e2b20c8-28b6-4f45-825b-0452d27fa54a/volumes" Dec 05 08:53:42 crc kubenswrapper[4795]: I1205 08:53:42.772457 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267818a6-5bea-4056-b190-a38341b02b4c" path="/var/lib/kubelet/pods/267818a6-5bea-4056-b190-a38341b02b4c/volumes" Dec 05 08:53:42 crc kubenswrapper[4795]: I1205 08:53:42.776111 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3340f491-333a-4403-8be5-0010ad46ece2" path="/var/lib/kubelet/pods/3340f491-333a-4403-8be5-0010ad46ece2/volumes" Dec 05 08:53:42 crc kubenswrapper[4795]: I1205 08:53:42.779453 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47061433-b600-4907-8baf-77eb23065955" path="/var/lib/kubelet/pods/47061433-b600-4907-8baf-77eb23065955/volumes" Dec 05 08:53:42 crc kubenswrapper[4795]: I1205 08:53:42.784515 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c781e8-96e9-48d0-b837-7d63996efd39" path="/var/lib/kubelet/pods/b6c781e8-96e9-48d0-b837-7d63996efd39/volumes" Dec 05 08:53:42 crc kubenswrapper[4795]: I1205 08:53:42.786929 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18679a7-6c6c-464c-92f0-893963f6a994" path="/var/lib/kubelet/pods/e18679a7-6c6c-464c-92f0-893963f6a994/volumes" Dec 05 08:53:47 crc kubenswrapper[4795]: I1205 08:53:47.049151 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-w5h4x"] Dec 05 08:53:47 crc kubenswrapper[4795]: I1205 08:53:47.075674 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-w5h4x"] Dec 05 08:53:48 crc kubenswrapper[4795]: I1205 08:53:48.765605 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff5addc-f5af-4fac-b0ff-d5f34d238e69" path="/var/lib/kubelet/pods/3ff5addc-f5af-4fac-b0ff-d5f34d238e69/volumes" 
Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.320581 4795 scope.go:117] "RemoveContainer" containerID="e1b168a69727606f44f05e214b8152cd080062dcf89e7dd9552572295767d625" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.355242 4795 scope.go:117] "RemoveContainer" containerID="1ffd14242c1ab51541bd6266a5d2a2dcee44dcb3a851f7ee71b7490b377f1cd3" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.430093 4795 scope.go:117] "RemoveContainer" containerID="507426afa98f2c41056331e786e80503871054ca1ddc6d4a8a63a011306490dd" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.500658 4795 scope.go:117] "RemoveContainer" containerID="6f80e22a373a0c17b426ec9109bf61ff04782a7078bef7ec3d08ea87c6f7b1b0" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.560003 4795 scope.go:117] "RemoveContainer" containerID="e0fa72de53dc1ad92f9cb5cb6285c67be3685979872e0395ea1406815604c7ea" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.592719 4795 scope.go:117] "RemoveContainer" containerID="6b44e81296b76a731cca8a672c14c4685f5d418e47ecd16251cbe2ac0016fc53" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.655470 4795 scope.go:117] "RemoveContainer" containerID="24a16bab7e899618ec2f74f7091be078680e9df1923def1e5f1eaca842e82c1d" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.697048 4795 scope.go:117] "RemoveContainer" containerID="a840631bdd0a34a1e1595c6a48fc21300a16140e8d0dabb6cde42a48126b8bb8" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.721963 4795 scope.go:117] "RemoveContainer" containerID="4fd2d8dd61ec680a52d697af2f536aad4617a91d89ae6455fda03b8a472bfc6b" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.748070 4795 scope.go:117] "RemoveContainer" containerID="b836be2049535bb20686372de242ae824d9b525c2d02bd33b6c0105a402e063f" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.786418 4795 scope.go:117] "RemoveContainer" containerID="fa2700636dfb09e55986f1c309d9767ed73b7b1452490411afe1d84e92a716e8" Dec 05 08:53:50 crc 
kubenswrapper[4795]: I1205 08:53:50.814202 4795 scope.go:117] "RemoveContainer" containerID="6cdf1e0af36fd9920382b1be51940ff8373451a22a65edd9d861f3c8be4638ae" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.836375 4795 scope.go:117] "RemoveContainer" containerID="894be04402414d23cd3d7ef4faab946243b79939d8d1ebb8f7875e2424a4b193" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.867160 4795 scope.go:117] "RemoveContainer" containerID="fd5427a200add60d5a335aa938c2d568a67b57e25bfc600f2cb942814225659f" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.907165 4795 scope.go:117] "RemoveContainer" containerID="b598d3b7932243c20810b6e4bc88a2002d7aea4ca0187c81a917ed10053025ba" Dec 05 08:53:50 crc kubenswrapper[4795]: I1205 08:53:50.930777 4795 scope.go:117] "RemoveContainer" containerID="8eaa2126e49c8d58b1b9e86420333c7fe7f751a020a577786bbb5d8cb7ed9055" Dec 05 08:53:55 crc kubenswrapper[4795]: I1205 08:53:55.747083 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:53:55 crc kubenswrapper[4795]: E1205 08:53:55.749898 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:54:09 crc kubenswrapper[4795]: I1205 08:54:09.748241 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:54:09 crc kubenswrapper[4795]: E1205 08:54:09.749234 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:54:20 crc kubenswrapper[4795]: I1205 08:54:20.748395 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:54:20 crc kubenswrapper[4795]: E1205 08:54:20.750812 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:54:28 crc kubenswrapper[4795]: I1205 08:54:28.052110 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-bxt72"] Dec 05 08:54:28 crc kubenswrapper[4795]: I1205 08:54:28.060390 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-bxt72"] Dec 05 08:54:28 crc kubenswrapper[4795]: I1205 08:54:28.762374 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de95e6b-d594-4ed4-8b8d-041346856347" path="/var/lib/kubelet/pods/7de95e6b-d594-4ed4-8b8d-041346856347/volumes" Dec 05 08:54:31 crc kubenswrapper[4795]: I1205 08:54:31.748768 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:54:31 crc kubenswrapper[4795]: E1205 08:54:31.749969 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:54:40 crc kubenswrapper[4795]: I1205 08:54:40.030556 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e42d8c8-afcc-4c91-a967-5aac94f29019" containerID="11f3bf873250c12c5e611a55bc0bdeb87049c9cc5fe656afb4988e7c734c132a" exitCode=0 Dec 05 08:54:40 crc kubenswrapper[4795]: I1205 08:54:40.030730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" event={"ID":"5e42d8c8-afcc-4c91-a967-5aac94f29019","Type":"ContainerDied","Data":"11f3bf873250c12c5e611a55bc0bdeb87049c9cc5fe656afb4988e7c734c132a"} Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.547822 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.557097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-ssh-key\") pod \"5e42d8c8-afcc-4c91-a967-5aac94f29019\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.557171 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-bootstrap-combined-ca-bundle\") pod \"5e42d8c8-afcc-4c91-a967-5aac94f29019\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.557246 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-inventory\") pod 
\"5e42d8c8-afcc-4c91-a967-5aac94f29019\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.557288 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knfkd\" (UniqueName: \"kubernetes.io/projected/5e42d8c8-afcc-4c91-a967-5aac94f29019-kube-api-access-knfkd\") pod \"5e42d8c8-afcc-4c91-a967-5aac94f29019\" (UID: \"5e42d8c8-afcc-4c91-a967-5aac94f29019\") " Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.577413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5e42d8c8-afcc-4c91-a967-5aac94f29019" (UID: "5e42d8c8-afcc-4c91-a967-5aac94f29019"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.577878 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e42d8c8-afcc-4c91-a967-5aac94f29019-kube-api-access-knfkd" (OuterVolumeSpecName: "kube-api-access-knfkd") pod "5e42d8c8-afcc-4c91-a967-5aac94f29019" (UID: "5e42d8c8-afcc-4c91-a967-5aac94f29019"). InnerVolumeSpecName "kube-api-access-knfkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.614381 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5e42d8c8-afcc-4c91-a967-5aac94f29019" (UID: "5e42d8c8-afcc-4c91-a967-5aac94f29019"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.621192 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-inventory" (OuterVolumeSpecName: "inventory") pod "5e42d8c8-afcc-4c91-a967-5aac94f29019" (UID: "5e42d8c8-afcc-4c91-a967-5aac94f29019"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.659286 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.659332 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.659343 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e42d8c8-afcc-4c91-a967-5aac94f29019-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 08:54:41 crc kubenswrapper[4795]: I1205 08:54:41.659353 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knfkd\" (UniqueName: \"kubernetes.io/projected/5e42d8c8-afcc-4c91-a967-5aac94f29019-kube-api-access-knfkd\") on node \"crc\" DevicePath \"\"" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.059268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" event={"ID":"5e42d8c8-afcc-4c91-a967-5aac94f29019","Type":"ContainerDied","Data":"fdbfc8bed1397f7d24894f6cdf696800922a888adde486546929b4255697c562"} Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.059320 4795 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="fdbfc8bed1397f7d24894f6cdf696800922a888adde486546929b4255697c562" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.059363 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.207393 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j"] Dec 05 08:54:42 crc kubenswrapper[4795]: E1205 08:54:42.208068 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e42d8c8-afcc-4c91-a967-5aac94f29019" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.208100 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e42d8c8-afcc-4c91-a967-5aac94f29019" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.208532 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e42d8c8-afcc-4c91-a967-5aac94f29019" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.216820 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.222812 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j"] Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.223193 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.224362 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.224815 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.226386 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.274853 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.274974 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8x8r\" (UniqueName: \"kubernetes.io/projected/7eb44bb0-b9bd-4a64-97de-d1c08b927625-kube-api-access-x8x8r\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.275065 
4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.376784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.376884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.376960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8x8r\" (UniqueName: \"kubernetes.io/projected/7eb44bb0-b9bd-4a64-97de-d1c08b927625-kube-api-access-x8x8r\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.387207 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.387237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.400167 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8x8r\" (UniqueName: \"kubernetes.io/projected/7eb44bb0-b9bd-4a64-97de-d1c08b927625-kube-api-access-x8x8r\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.549398 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:54:42 crc kubenswrapper[4795]: I1205 08:54:42.748888 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:54:42 crc kubenswrapper[4795]: E1205 08:54:42.749528 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:54:43 crc kubenswrapper[4795]: I1205 08:54:43.184833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j"] Dec 05 08:54:44 crc kubenswrapper[4795]: I1205 08:54:44.081140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" event={"ID":"7eb44bb0-b9bd-4a64-97de-d1c08b927625","Type":"ContainerStarted","Data":"15a752920a1657276b355b186c3736205220f2ee328a6c8ec8bde8ead106a3ff"} Dec 05 08:54:44 crc kubenswrapper[4795]: I1205 08:54:44.081745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" event={"ID":"7eb44bb0-b9bd-4a64-97de-d1c08b927625","Type":"ContainerStarted","Data":"857a9a3a179bda2cf4a318e2469721ba26cf616fd3c04167f76243c950f48929"} Dec 05 08:54:44 crc kubenswrapper[4795]: I1205 08:54:44.105292 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" podStartSLOduration=1.916228359 podStartE2EDuration="2.10526468s" podCreationTimestamp="2025-12-05 08:54:42 +0000 UTC" 
firstStartedPulling="2025-12-05 08:54:43.207844233 +0000 UTC m=+1834.780447972" lastFinishedPulling="2025-12-05 08:54:43.396880564 +0000 UTC m=+1834.969484293" observedRunningTime="2025-12-05 08:54:44.101014824 +0000 UTC m=+1835.673618563" watchObservedRunningTime="2025-12-05 08:54:44.10526468 +0000 UTC m=+1835.677868419" Dec 05 08:54:48 crc kubenswrapper[4795]: I1205 08:54:48.048625 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cg5dx"] Dec 05 08:54:48 crc kubenswrapper[4795]: I1205 08:54:48.059123 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-nxkbh"] Dec 05 08:54:48 crc kubenswrapper[4795]: I1205 08:54:48.068949 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cg5dx"] Dec 05 08:54:48 crc kubenswrapper[4795]: I1205 08:54:48.078208 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-nxkbh"] Dec 05 08:54:48 crc kubenswrapper[4795]: I1205 08:54:48.761390 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b05e0563-19d6-439b-b9d2-d241537794c4" path="/var/lib/kubelet/pods/b05e0563-19d6-439b-b9d2-d241537794c4/volumes" Dec 05 08:54:48 crc kubenswrapper[4795]: I1205 08:54:48.763586 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87" path="/var/lib/kubelet/pods/d0d5ec62-fc7e-41e9-b8d9-bb07f5b67a87/volumes" Dec 05 08:54:51 crc kubenswrapper[4795]: I1205 08:54:51.318339 4795 scope.go:117] "RemoveContainer" containerID="0da4fdb12c2d95e3f2dde1f024cce45891fc2bf5f879e6a7cc473244c713f96d" Dec 05 08:54:51 crc kubenswrapper[4795]: I1205 08:54:51.380288 4795 scope.go:117] "RemoveContainer" containerID="21ba3b63122aef1bb5134dde6523b67bf9263ad14895e2960724b5a4d48c62a2" Dec 05 08:54:51 crc kubenswrapper[4795]: I1205 08:54:51.469999 4795 scope.go:117] "RemoveContainer" containerID="46fa3526494c87f64a30b69fcb7bba963258964ce3e90db0536aca8a8bd359f6" 
Dec 05 08:54:54 crc kubenswrapper[4795]: I1205 08:54:54.748851 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:54:54 crc kubenswrapper[4795]: E1205 08:54:54.750302 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:55:04 crc kubenswrapper[4795]: I1205 08:55:04.055486 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-cq4gw"] Dec 05 08:55:04 crc kubenswrapper[4795]: I1205 08:55:04.065278 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-cq4gw"] Dec 05 08:55:04 crc kubenswrapper[4795]: I1205 08:55:04.768321 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6ce9d5-263a-4b05-83e5-c349f0038001" path="/var/lib/kubelet/pods/dd6ce9d5-263a-4b05-83e5-c349f0038001/volumes" Dec 05 08:55:05 crc kubenswrapper[4795]: I1205 08:55:05.038327 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-k28kg"] Dec 05 08:55:05 crc kubenswrapper[4795]: I1205 08:55:05.048631 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-k28kg"] Dec 05 08:55:06 crc kubenswrapper[4795]: I1205 08:55:06.762523 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44d62cd1-585f-4756-b3f9-6f0714ea3248" path="/var/lib/kubelet/pods/44d62cd1-585f-4756-b3f9-6f0714ea3248/volumes" Dec 05 08:55:09 crc kubenswrapper[4795]: I1205 08:55:09.748338 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:55:09 crc 
kubenswrapper[4795]: E1205 08:55:09.749409 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:55:11 crc kubenswrapper[4795]: I1205 08:55:11.167249 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-584d9d78f9-hwfvk" podUID="78ae9e33-4a1a-4296-8b17-65c7775bd5ec" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 05 08:55:22 crc kubenswrapper[4795]: I1205 08:55:22.747474 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:55:22 crc kubenswrapper[4795]: E1205 08:55:22.748559 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:55:35 crc kubenswrapper[4795]: I1205 08:55:35.748858 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:55:35 crc kubenswrapper[4795]: E1205 08:55:35.750513 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:55:50 crc kubenswrapper[4795]: I1205 08:55:50.747319 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:55:50 crc kubenswrapper[4795]: E1205 08:55:50.748344 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:55:51 crc kubenswrapper[4795]: I1205 08:55:51.576933 4795 scope.go:117] "RemoveContainer" containerID="ce80d725ee6315e111101dafb5d32da3aba4e9418ef8dafa2249abbbb6fcc646" Dec 05 08:55:51 crc kubenswrapper[4795]: I1205 08:55:51.610804 4795 scope.go:117] "RemoveContainer" containerID="48f628ae6feab2d2ac76c438de9dfdf7f23719be803014cdf3e16dcbb6e4ad36" Dec 05 08:55:51 crc kubenswrapper[4795]: I1205 08:55:51.663963 4795 scope.go:117] "RemoveContainer" containerID="13bfbf7fe1f191fe24fe49e4647d6e034191d73e1e2574fc207c2987ae14c730" Dec 05 08:55:51 crc kubenswrapper[4795]: I1205 08:55:51.725936 4795 scope.go:117] "RemoveContainer" containerID="3965dbcfed1102d4d5e89c40cecc086a0635b77949a3babca31ce6ffe5799f4a" Dec 05 08:56:02 crc kubenswrapper[4795]: I1205 08:56:02.748832 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:56:02 crc kubenswrapper[4795]: E1205 08:56:02.750224 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:56:07 crc kubenswrapper[4795]: I1205 08:56:07.046143 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-x7v2r"] Dec 05 08:56:07 crc kubenswrapper[4795]: I1205 08:56:07.055972 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-x7v2r"] Dec 05 08:56:07 crc kubenswrapper[4795]: I1205 08:56:07.063016 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gfv9h"] Dec 05 08:56:07 crc kubenswrapper[4795]: I1205 08:56:07.072528 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gfv9h"] Dec 05 08:56:08 crc kubenswrapper[4795]: I1205 08:56:08.047324 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3218-account-create-update-8lz4k"] Dec 05 08:56:08 crc kubenswrapper[4795]: I1205 08:56:08.063227 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3218-account-create-update-8lz4k"] Dec 05 08:56:08 crc kubenswrapper[4795]: I1205 08:56:08.794343 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33dfb11f-a33f-463e-ae7a-8f3891042c4d" path="/var/lib/kubelet/pods/33dfb11f-a33f-463e-ae7a-8f3891042c4d/volumes" Dec 05 08:56:08 crc kubenswrapper[4795]: I1205 08:56:08.796885 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dc46818-e0ca-4331-9f50-f01e5f50d812" path="/var/lib/kubelet/pods/6dc46818-e0ca-4331-9f50-f01e5f50d812/volumes" Dec 05 08:56:08 crc kubenswrapper[4795]: I1205 08:56:08.798096 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c97250f7-6ac0-48ea-897a-741b1bc97c1d" 
path="/var/lib/kubelet/pods/c97250f7-6ac0-48ea-897a-741b1bc97c1d/volumes" Dec 05 08:56:09 crc kubenswrapper[4795]: I1205 08:56:09.051185 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4e84-account-create-update-tqw8b"] Dec 05 08:56:09 crc kubenswrapper[4795]: I1205 08:56:09.062125 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ed0a-account-create-update-j8w29"] Dec 05 08:56:09 crc kubenswrapper[4795]: I1205 08:56:09.073733 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4e84-account-create-update-tqw8b"] Dec 05 08:56:09 crc kubenswrapper[4795]: I1205 08:56:09.085554 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-2nlv6"] Dec 05 08:56:09 crc kubenswrapper[4795]: I1205 08:56:09.093905 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ed0a-account-create-update-j8w29"] Dec 05 08:56:09 crc kubenswrapper[4795]: I1205 08:56:09.104596 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-2nlv6"] Dec 05 08:56:10 crc kubenswrapper[4795]: I1205 08:56:10.776946 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415c9625-8f52-41fc-818f-420b59863110" path="/var/lib/kubelet/pods/415c9625-8f52-41fc-818f-420b59863110/volumes" Dec 05 08:56:10 crc kubenswrapper[4795]: I1205 08:56:10.778971 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="689df0bf-6323-4159-8433-8d916f33abff" path="/var/lib/kubelet/pods/689df0bf-6323-4159-8433-8d916f33abff/volumes" Dec 05 08:56:10 crc kubenswrapper[4795]: I1205 08:56:10.780272 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a704000f-7677-42e5-86cd-4cbc8134785b" path="/var/lib/kubelet/pods/a704000f-7677-42e5-86cd-4cbc8134785b/volumes" Dec 05 08:56:16 crc kubenswrapper[4795]: I1205 08:56:16.748413 4795 scope.go:117] "RemoveContainer" 
containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:56:16 crc kubenswrapper[4795]: E1205 08:56:16.749593 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:56:31 crc kubenswrapper[4795]: I1205 08:56:31.747776 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:56:31 crc kubenswrapper[4795]: E1205 08:56:31.748878 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:56:45 crc kubenswrapper[4795]: I1205 08:56:45.747971 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:56:45 crc kubenswrapper[4795]: E1205 08:56:45.748965 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:56:51 crc kubenswrapper[4795]: I1205 08:56:51.551218 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="7eb44bb0-b9bd-4a64-97de-d1c08b927625" containerID="15a752920a1657276b355b186c3736205220f2ee328a6c8ec8bde8ead106a3ff" exitCode=0 Dec 05 08:56:51 crc kubenswrapper[4795]: I1205 08:56:51.551260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" event={"ID":"7eb44bb0-b9bd-4a64-97de-d1c08b927625","Type":"ContainerDied","Data":"15a752920a1657276b355b186c3736205220f2ee328a6c8ec8bde8ead106a3ff"} Dec 05 08:56:51 crc kubenswrapper[4795]: I1205 08:56:51.874338 4795 scope.go:117] "RemoveContainer" containerID="34e91df4f29c5b1464a26a71df42630a156b939dccb1c4734ee5530c32fb8c7d" Dec 05 08:56:51 crc kubenswrapper[4795]: I1205 08:56:51.917292 4795 scope.go:117] "RemoveContainer" containerID="025a3d8731d45f13171e50e3bdda0242427ad278829f9a0635d5732d3fcb88d7" Dec 05 08:56:51 crc kubenswrapper[4795]: I1205 08:56:51.965250 4795 scope.go:117] "RemoveContainer" containerID="b764a5be5f069078ea70c0e1d1e84b6c85c7d4d02fe019d1a8b8c4a27fc10d6f" Dec 05 08:56:52 crc kubenswrapper[4795]: I1205 08:56:52.019147 4795 scope.go:117] "RemoveContainer" containerID="92bd802f405429443aba150f3e2c90044b94f1db8e464eb4ec38757edcbbdef8" Dec 05 08:56:52 crc kubenswrapper[4795]: I1205 08:56:52.058713 4795 scope.go:117] "RemoveContainer" containerID="1a8638debddde0b27d190e26595fa1579fb342e98510f1dd133bc83ea82b0b0b" Dec 05 08:56:52 crc kubenswrapper[4795]: I1205 08:56:52.125526 4795 scope.go:117] "RemoveContainer" containerID="0afe21a5272589853b1a8d25b0d25fb526f7b510a918cef6cfd1dd053c5bef09" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.074194 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.131157 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8x8r\" (UniqueName: \"kubernetes.io/projected/7eb44bb0-b9bd-4a64-97de-d1c08b927625-kube-api-access-x8x8r\") pod \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.131258 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-ssh-key\") pod \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.131331 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-inventory\") pod \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\" (UID: \"7eb44bb0-b9bd-4a64-97de-d1c08b927625\") " Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.150820 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb44bb0-b9bd-4a64-97de-d1c08b927625-kube-api-access-x8x8r" (OuterVolumeSpecName: "kube-api-access-x8x8r") pod "7eb44bb0-b9bd-4a64-97de-d1c08b927625" (UID: "7eb44bb0-b9bd-4a64-97de-d1c08b927625"). InnerVolumeSpecName "kube-api-access-x8x8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.191929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7eb44bb0-b9bd-4a64-97de-d1c08b927625" (UID: "7eb44bb0-b9bd-4a64-97de-d1c08b927625"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.202294 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-inventory" (OuterVolumeSpecName: "inventory") pod "7eb44bb0-b9bd-4a64-97de-d1c08b927625" (UID: "7eb44bb0-b9bd-4a64-97de-d1c08b927625"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.234822 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8x8r\" (UniqueName: \"kubernetes.io/projected/7eb44bb0-b9bd-4a64-97de-d1c08b927625-kube-api-access-x8x8r\") on node \"crc\" DevicePath \"\"" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.234856 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.234870 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eb44bb0-b9bd-4a64-97de-d1c08b927625-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.573877 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" event={"ID":"7eb44bb0-b9bd-4a64-97de-d1c08b927625","Type":"ContainerDied","Data":"857a9a3a179bda2cf4a318e2469721ba26cf616fd3c04167f76243c950f48929"} Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.573927 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="857a9a3a179bda2cf4a318e2469721ba26cf616fd3c04167f76243c950f48929" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.574005 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.706773 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg"] Dec 05 08:56:53 crc kubenswrapper[4795]: E1205 08:56:53.707283 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb44bb0-b9bd-4a64-97de-d1c08b927625" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.707304 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb44bb0-b9bd-4a64-97de-d1c08b927625" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.707482 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb44bb0-b9bd-4a64-97de-d1c08b927625" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.708435 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.712835 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.713223 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.713373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.714489 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.736776 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg"] Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.762942 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.763017 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6bvb\" (UniqueName: \"kubernetes.io/projected/fdfda940-145b-497f-8adc-d001a4f852ed-kube-api-access-n6bvb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: 
I1205 08:56:53.763098 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.866178 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.866898 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.866940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6bvb\" (UniqueName: \"kubernetes.io/projected/fdfda940-145b-497f-8adc-d001a4f852ed-kube-api-access-n6bvb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.875797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.882231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:53 crc kubenswrapper[4795]: I1205 08:56:53.893075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6bvb\" (UniqueName: \"kubernetes.io/projected/fdfda940-145b-497f-8adc-d001a4f852ed-kube-api-access-n6bvb\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:54 crc kubenswrapper[4795]: I1205 08:56:54.028885 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:56:54 crc kubenswrapper[4795]: I1205 08:56:54.898283 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg"] Dec 05 08:56:54 crc kubenswrapper[4795]: I1205 08:56:54.906518 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 08:56:55 crc kubenswrapper[4795]: I1205 08:56:55.603894 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" event={"ID":"fdfda940-145b-497f-8adc-d001a4f852ed","Type":"ContainerStarted","Data":"e54c1eaefc731d7f1fb6d7b03e9d0c7821b45c16bb629fc6f6bededc8a2a36ac"} Dec 05 08:56:55 crc kubenswrapper[4795]: I1205 08:56:55.605098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" event={"ID":"fdfda940-145b-497f-8adc-d001a4f852ed","Type":"ContainerStarted","Data":"b0480bd6d8424f875278dd3cf5ba19ddd6f797bb921a6940c349b5d21ffc9c36"} Dec 05 08:56:55 crc kubenswrapper[4795]: I1205 08:56:55.630030 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" podStartSLOduration=2.427830509 podStartE2EDuration="2.629988291s" podCreationTimestamp="2025-12-05 08:56:53 +0000 UTC" firstStartedPulling="2025-12-05 08:56:54.906228269 +0000 UTC m=+1966.478832008" lastFinishedPulling="2025-12-05 08:56:55.108386051 +0000 UTC m=+1966.680989790" observedRunningTime="2025-12-05 08:56:55.621541467 +0000 UTC m=+1967.194145206" watchObservedRunningTime="2025-12-05 08:56:55.629988291 +0000 UTC m=+1967.202592030" Dec 05 08:56:57 crc kubenswrapper[4795]: I1205 08:56:57.748702 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 
08:56:57 crc kubenswrapper[4795]: E1205 08:56:57.749745 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:57:02 crc kubenswrapper[4795]: I1205 08:57:02.067908 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-js4j6"] Dec 05 08:57:02 crc kubenswrapper[4795]: I1205 08:57:02.080596 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-js4j6"] Dec 05 08:57:02 crc kubenswrapper[4795]: I1205 08:57:02.759542 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72fc3705-c3fb-494a-8dc3-9949853a7c1a" path="/var/lib/kubelet/pods/72fc3705-c3fb-494a-8dc3-9949853a7c1a/volumes" Dec 05 08:57:08 crc kubenswrapper[4795]: I1205 08:57:08.761454 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:57:08 crc kubenswrapper[4795]: E1205 08:57:08.763105 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 08:57:19 crc kubenswrapper[4795]: I1205 08:57:19.752120 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 08:57:20 crc kubenswrapper[4795]: I1205 08:57:20.906914 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"3bb8db58c6ff5eff08a8e0fa9aff2e776f8b94d253c5d66fa55553b15803d255"} Dec 05 08:57:35 crc kubenswrapper[4795]: I1205 08:57:35.058874 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-ltnvx"] Dec 05 08:57:35 crc kubenswrapper[4795]: I1205 08:57:35.072656 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-ltnvx"] Dec 05 08:57:36 crc kubenswrapper[4795]: I1205 08:57:36.773024 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2bcede8-6fd6-409e-83ef-306a7912dc7f" path="/var/lib/kubelet/pods/f2bcede8-6fd6-409e-83ef-306a7912dc7f/volumes" Dec 05 08:57:44 crc kubenswrapper[4795]: I1205 08:57:44.044640 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4qr2p"] Dec 05 08:57:44 crc kubenswrapper[4795]: I1205 08:57:44.057372 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-4qr2p"] Dec 05 08:57:44 crc kubenswrapper[4795]: I1205 08:57:44.764676 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d614be9-37fb-439a-895d-eb5c92210497" path="/var/lib/kubelet/pods/4d614be9-37fb-439a-895d-eb5c92210497/volumes" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.483353 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b5g9w"] Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.486524 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.503057 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5g9w"] Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.642782 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcgzs\" (UniqueName: \"kubernetes.io/projected/9f1f421c-0395-44a6-b922-02affb104c0f-kube-api-access-qcgzs\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.642876 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-catalog-content\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.642943 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-utilities\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.745563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-catalog-content\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.745691 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-utilities\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.745803 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcgzs\" (UniqueName: \"kubernetes.io/projected/9f1f421c-0395-44a6-b922-02affb104c0f-kube-api-access-qcgzs\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.746340 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-catalog-content\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.746426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-utilities\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.769197 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcgzs\" (UniqueName: \"kubernetes.io/projected/9f1f421c-0395-44a6-b922-02affb104c0f-kube-api-access-qcgzs\") pod \"community-operators-b5g9w\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:46 crc kubenswrapper[4795]: I1205 08:57:46.858336 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:47 crc kubenswrapper[4795]: W1205 08:57:47.509590 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f1f421c_0395_44a6_b922_02affb104c0f.slice/crio-76fd02df33cd084bd20c8b1295a87c7a0d67beecb1d827cf12289a4716c43567 WatchSource:0}: Error finding container 76fd02df33cd084bd20c8b1295a87c7a0d67beecb1d827cf12289a4716c43567: Status 404 returned error can't find the container with id 76fd02df33cd084bd20c8b1295a87c7a0d67beecb1d827cf12289a4716c43567 Dec 05 08:57:47 crc kubenswrapper[4795]: I1205 08:57:47.510225 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5g9w"] Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.230219 4795 generic.go:334] "Generic (PLEG): container finished" podID="9f1f421c-0395-44a6-b922-02affb104c0f" containerID="cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251" exitCode=0 Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.230345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5g9w" event={"ID":"9f1f421c-0395-44a6-b922-02affb104c0f","Type":"ContainerDied","Data":"cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251"} Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.230949 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5g9w" event={"ID":"9f1f421c-0395-44a6-b922-02affb104c0f","Type":"ContainerStarted","Data":"76fd02df33cd084bd20c8b1295a87c7a0d67beecb1d827cf12289a4716c43567"} Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.887908 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d2n79"] Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.891155 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.911588 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2n79"] Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.998409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-utilities\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.999118 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkqq9\" (UniqueName: \"kubernetes.io/projected/04277004-cab9-4a24-b2fc-7a7383138f18-kube-api-access-rkqq9\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:48 crc kubenswrapper[4795]: I1205 08:57:48.999660 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-catalog-content\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.102210 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-catalog-content\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.102282 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-utilities\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.102316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkqq9\" (UniqueName: \"kubernetes.io/projected/04277004-cab9-4a24-b2fc-7a7383138f18-kube-api-access-rkqq9\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.102964 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-catalog-content\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.103043 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-utilities\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.134443 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkqq9\" (UniqueName: \"kubernetes.io/projected/04277004-cab9-4a24-b2fc-7a7383138f18-kube-api-access-rkqq9\") pod \"redhat-marketplace-d2n79\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.233754 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.245102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5g9w" event={"ID":"9f1f421c-0395-44a6-b922-02affb104c0f","Type":"ContainerStarted","Data":"d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3"} Dec 05 08:57:49 crc kubenswrapper[4795]: I1205 08:57:49.916332 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2n79"] Dec 05 08:57:50 crc kubenswrapper[4795]: I1205 08:57:50.290113 4795 generic.go:334] "Generic (PLEG): container finished" podID="04277004-cab9-4a24-b2fc-7a7383138f18" containerID="423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de" exitCode=0 Dec 05 08:57:50 crc kubenswrapper[4795]: I1205 08:57:50.290304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2n79" event={"ID":"04277004-cab9-4a24-b2fc-7a7383138f18","Type":"ContainerDied","Data":"423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de"} Dec 05 08:57:50 crc kubenswrapper[4795]: I1205 08:57:50.290806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2n79" event={"ID":"04277004-cab9-4a24-b2fc-7a7383138f18","Type":"ContainerStarted","Data":"6d820275bcff1b52e1989d536cc10b07e87b1eab4b6f8ee509aadd562d85cf98"} Dec 05 08:57:51 crc kubenswrapper[4795]: I1205 08:57:51.311894 4795 generic.go:334] "Generic (PLEG): container finished" podID="9f1f421c-0395-44a6-b922-02affb104c0f" containerID="d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3" exitCode=0 Dec 05 08:57:51 crc kubenswrapper[4795]: I1205 08:57:51.311979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5g9w" 
event={"ID":"9f1f421c-0395-44a6-b922-02affb104c0f","Type":"ContainerDied","Data":"d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3"} Dec 05 08:57:52 crc kubenswrapper[4795]: I1205 08:57:52.296636 4795 scope.go:117] "RemoveContainer" containerID="503704dd0c63698020dbe8d1d29b72e242e762697425ffe87457e0153bc3b70b" Dec 05 08:57:52 crc kubenswrapper[4795]: I1205 08:57:52.360947 4795 generic.go:334] "Generic (PLEG): container finished" podID="04277004-cab9-4a24-b2fc-7a7383138f18" containerID="6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca" exitCode=0 Dec 05 08:57:52 crc kubenswrapper[4795]: I1205 08:57:52.361113 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2n79" event={"ID":"04277004-cab9-4a24-b2fc-7a7383138f18","Type":"ContainerDied","Data":"6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca"} Dec 05 08:57:52 crc kubenswrapper[4795]: I1205 08:57:52.370253 4795 scope.go:117] "RemoveContainer" containerID="10cb78d4de0eb2cead73b25d4b22b2aa00b82621ad1e9971d37f2c0171c767a1" Dec 05 08:57:52 crc kubenswrapper[4795]: I1205 08:57:52.374625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5g9w" event={"ID":"9f1f421c-0395-44a6-b922-02affb104c0f","Type":"ContainerStarted","Data":"a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df"} Dec 05 08:57:52 crc kubenswrapper[4795]: I1205 08:57:52.431742 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b5g9w" podStartSLOduration=2.630676722 podStartE2EDuration="6.431699222s" podCreationTimestamp="2025-12-05 08:57:46 +0000 UTC" firstStartedPulling="2025-12-05 08:57:48.235026339 +0000 UTC m=+2019.807630078" lastFinishedPulling="2025-12-05 08:57:52.036048839 +0000 UTC m=+2023.608652578" observedRunningTime="2025-12-05 08:57:52.417111905 +0000 UTC m=+2023.989715644" watchObservedRunningTime="2025-12-05 
08:57:52.431699222 +0000 UTC m=+2024.004302961" Dec 05 08:57:52 crc kubenswrapper[4795]: I1205 08:57:52.445241 4795 scope.go:117] "RemoveContainer" containerID="83bd713295be536c586766caaae00dfd52bb90420b425a3c3e0a75afdc216cbe" Dec 05 08:57:53 crc kubenswrapper[4795]: I1205 08:57:53.385189 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2n79" event={"ID":"04277004-cab9-4a24-b2fc-7a7383138f18","Type":"ContainerStarted","Data":"8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438"} Dec 05 08:57:53 crc kubenswrapper[4795]: I1205 08:57:53.413494 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d2n79" podStartSLOduration=2.942981044 podStartE2EDuration="5.413474709s" podCreationTimestamp="2025-12-05 08:57:48 +0000 UTC" firstStartedPulling="2025-12-05 08:57:50.293329172 +0000 UTC m=+2021.865932911" lastFinishedPulling="2025-12-05 08:57:52.763822837 +0000 UTC m=+2024.336426576" observedRunningTime="2025-12-05 08:57:53.411470657 +0000 UTC m=+2024.984074406" watchObservedRunningTime="2025-12-05 08:57:53.413474709 +0000 UTC m=+2024.986078448" Dec 05 08:57:56 crc kubenswrapper[4795]: I1205 08:57:56.859499 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:56 crc kubenswrapper[4795]: I1205 08:57:56.860709 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:56 crc kubenswrapper[4795]: I1205 08:57:56.914774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:57 crc kubenswrapper[4795]: I1205 08:57:57.513795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:57:58 crc kubenswrapper[4795]: I1205 
08:57:58.275935 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5g9w"] Dec 05 08:57:59 crc kubenswrapper[4795]: I1205 08:57:59.235557 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:59 crc kubenswrapper[4795]: I1205 08:57:59.236317 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:59 crc kubenswrapper[4795]: I1205 08:57:59.320965 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:59 crc kubenswrapper[4795]: I1205 08:57:59.479449 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b5g9w" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" containerName="registry-server" containerID="cri-o://a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df" gracePeriod=2 Dec 05 08:57:59 crc kubenswrapper[4795]: I1205 08:57:59.545599 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:57:59 crc kubenswrapper[4795]: I1205 08:57:59.955970 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.060783 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-utilities\") pod \"9f1f421c-0395-44a6-b922-02affb104c0f\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.060885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcgzs\" (UniqueName: \"kubernetes.io/projected/9f1f421c-0395-44a6-b922-02affb104c0f-kube-api-access-qcgzs\") pod \"9f1f421c-0395-44a6-b922-02affb104c0f\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.061108 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-catalog-content\") pod \"9f1f421c-0395-44a6-b922-02affb104c0f\" (UID: \"9f1f421c-0395-44a6-b922-02affb104c0f\") " Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.062017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-utilities" (OuterVolumeSpecName: "utilities") pod "9f1f421c-0395-44a6-b922-02affb104c0f" (UID: "9f1f421c-0395-44a6-b922-02affb104c0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.069964 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1f421c-0395-44a6-b922-02affb104c0f-kube-api-access-qcgzs" (OuterVolumeSpecName: "kube-api-access-qcgzs") pod "9f1f421c-0395-44a6-b922-02affb104c0f" (UID: "9f1f421c-0395-44a6-b922-02affb104c0f"). InnerVolumeSpecName "kube-api-access-qcgzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.126945 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f1f421c-0395-44a6-b922-02affb104c0f" (UID: "9f1f421c-0395-44a6-b922-02affb104c0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.163898 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.164353 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1f421c-0395-44a6-b922-02affb104c0f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.164418 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcgzs\" (UniqueName: \"kubernetes.io/projected/9f1f421c-0395-44a6-b922-02affb104c0f-kube-api-access-qcgzs\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.490341 4795 generic.go:334] "Generic (PLEG): container finished" podID="9f1f421c-0395-44a6-b922-02affb104c0f" containerID="a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df" exitCode=0 Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.490725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5g9w" event={"ID":"9f1f421c-0395-44a6-b922-02affb104c0f","Type":"ContainerDied","Data":"a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df"} Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.490862 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-b5g9w" event={"ID":"9f1f421c-0395-44a6-b922-02affb104c0f","Type":"ContainerDied","Data":"76fd02df33cd084bd20c8b1295a87c7a0d67beecb1d827cf12289a4716c43567"} Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.490894 4795 scope.go:117] "RemoveContainer" containerID="a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.490768 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b5g9w" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.519659 4795 scope.go:117] "RemoveContainer" containerID="d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.555531 4795 scope.go:117] "RemoveContainer" containerID="cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.555813 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5g9w"] Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.586927 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b5g9w"] Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.623576 4795 scope.go:117] "RemoveContainer" containerID="a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df" Dec 05 08:58:00 crc kubenswrapper[4795]: E1205 08:58:00.631241 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df\": container with ID starting with a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df not found: ID does not exist" containerID="a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 
08:58:00.631311 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df"} err="failed to get container status \"a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df\": rpc error: code = NotFound desc = could not find container \"a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df\": container with ID starting with a7a0a4d6163e405e352714f6b0ccde866a898faf644f3c183ee0a05a044de9df not found: ID does not exist" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.631348 4795 scope.go:117] "RemoveContainer" containerID="d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3" Dec 05 08:58:00 crc kubenswrapper[4795]: E1205 08:58:00.632878 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3\": container with ID starting with d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3 not found: ID does not exist" containerID="d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.632937 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3"} err="failed to get container status \"d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3\": rpc error: code = NotFound desc = could not find container \"d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3\": container with ID starting with d3edf1cf02a4a5c0e3a3149895edd3a0f188921f65b45e7be3c01e2ebdff47c3 not found: ID does not exist" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.632975 4795 scope.go:117] "RemoveContainer" containerID="cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251" Dec 05 08:58:00 crc 
kubenswrapper[4795]: E1205 08:58:00.633701 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251\": container with ID starting with cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251 not found: ID does not exist" containerID="cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.633737 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251"} err="failed to get container status \"cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251\": rpc error: code = NotFound desc = could not find container \"cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251\": container with ID starting with cb289e1397fe380415fb83e33446498fd54b59bf65d4b22220ef450c91047251 not found: ID does not exist" Dec 05 08:58:00 crc kubenswrapper[4795]: I1205 08:58:00.759222 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" path="/var/lib/kubelet/pods/9f1f421c-0395-44a6-b922-02affb104c0f/volumes" Dec 05 08:58:01 crc kubenswrapper[4795]: I1205 08:58:01.270123 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2n79"] Dec 05 08:58:01 crc kubenswrapper[4795]: I1205 08:58:01.508263 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d2n79" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" containerName="registry-server" containerID="cri-o://8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438" gracePeriod=2 Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.050802 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.109770 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkqq9\" (UniqueName: \"kubernetes.io/projected/04277004-cab9-4a24-b2fc-7a7383138f18-kube-api-access-rkqq9\") pod \"04277004-cab9-4a24-b2fc-7a7383138f18\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.109840 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-catalog-content\") pod \"04277004-cab9-4a24-b2fc-7a7383138f18\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.110050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-utilities\") pod \"04277004-cab9-4a24-b2fc-7a7383138f18\" (UID: \"04277004-cab9-4a24-b2fc-7a7383138f18\") " Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.110874 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-utilities" (OuterVolumeSpecName: "utilities") pod "04277004-cab9-4a24-b2fc-7a7383138f18" (UID: "04277004-cab9-4a24-b2fc-7a7383138f18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.139410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04277004-cab9-4a24-b2fc-7a7383138f18" (UID: "04277004-cab9-4a24-b2fc-7a7383138f18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.141925 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04277004-cab9-4a24-b2fc-7a7383138f18-kube-api-access-rkqq9" (OuterVolumeSpecName: "kube-api-access-rkqq9") pod "04277004-cab9-4a24-b2fc-7a7383138f18" (UID: "04277004-cab9-4a24-b2fc-7a7383138f18"). InnerVolumeSpecName "kube-api-access-rkqq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.212520 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.212786 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkqq9\" (UniqueName: \"kubernetes.io/projected/04277004-cab9-4a24-b2fc-7a7383138f18-kube-api-access-rkqq9\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.212848 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04277004-cab9-4a24-b2fc-7a7383138f18-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.538252 4795 generic.go:334] "Generic (PLEG): container finished" podID="04277004-cab9-4a24-b2fc-7a7383138f18" containerID="8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438" exitCode=0 Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.538324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2n79" event={"ID":"04277004-cab9-4a24-b2fc-7a7383138f18","Type":"ContainerDied","Data":"8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438"} Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.538387 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2n79" event={"ID":"04277004-cab9-4a24-b2fc-7a7383138f18","Type":"ContainerDied","Data":"6d820275bcff1b52e1989d536cc10b07e87b1eab4b6f8ee509aadd562d85cf98"} Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.538414 4795 scope.go:117] "RemoveContainer" containerID="8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.538682 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2n79" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.584789 4795 scope.go:117] "RemoveContainer" containerID="6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.618730 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2n79"] Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.658527 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2n79"] Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.662968 4795 scope.go:117] "RemoveContainer" containerID="423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.767699 4795 scope.go:117] "RemoveContainer" containerID="8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.768697 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" path="/var/lib/kubelet/pods/04277004-cab9-4a24-b2fc-7a7383138f18/volumes" Dec 05 08:58:02 crc kubenswrapper[4795]: E1205 08:58:02.769255 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438\": 
container with ID starting with 8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438 not found: ID does not exist" containerID="8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.769325 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438"} err="failed to get container status \"8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438\": rpc error: code = NotFound desc = could not find container \"8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438\": container with ID starting with 8e7296e9bcd5edad91c2d474e8ef7aa16d6c2298c238e5c55e48865689186438 not found: ID does not exist" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.769411 4795 scope.go:117] "RemoveContainer" containerID="6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca" Dec 05 08:58:02 crc kubenswrapper[4795]: E1205 08:58:02.779467 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca\": container with ID starting with 6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca not found: ID does not exist" containerID="6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.779531 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca"} err="failed to get container status \"6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca\": rpc error: code = NotFound desc = could not find container \"6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca\": container with ID starting with 
6ff0a22f70553f47edb92d14bbce713cc0db97cfcd7622af4fffba9339aa51ca not found: ID does not exist" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.779572 4795 scope.go:117] "RemoveContainer" containerID="423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de" Dec 05 08:58:02 crc kubenswrapper[4795]: E1205 08:58:02.782990 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de\": container with ID starting with 423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de not found: ID does not exist" containerID="423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de" Dec 05 08:58:02 crc kubenswrapper[4795]: I1205 08:58:02.783049 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de"} err="failed to get container status \"423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de\": rpc error: code = NotFound desc = could not find container \"423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de\": container with ID starting with 423002454fc256ff05a5f24645cad8f419388b7fc1a7258cb89ff39a16bbc7de not found: ID does not exist" Dec 05 08:58:18 crc kubenswrapper[4795]: I1205 08:58:18.050271 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-bml8k"] Dec 05 08:58:18 crc kubenswrapper[4795]: I1205 08:58:18.058084 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-bml8k"] Dec 05 08:58:18 crc kubenswrapper[4795]: I1205 08:58:18.771807 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22800d9f-49d9-4d82-b097-0e3e52a3d311" path="/var/lib/kubelet/pods/22800d9f-49d9-4d82-b097-0e3e52a3d311/volumes" Dec 05 08:58:21 crc kubenswrapper[4795]: I1205 08:58:21.737783 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="fdfda940-145b-497f-8adc-d001a4f852ed" containerID="e54c1eaefc731d7f1fb6d7b03e9d0c7821b45c16bb629fc6f6bededc8a2a36ac" exitCode=0 Dec 05 08:58:21 crc kubenswrapper[4795]: I1205 08:58:21.737960 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" event={"ID":"fdfda940-145b-497f-8adc-d001a4f852ed","Type":"ContainerDied","Data":"e54c1eaefc731d7f1fb6d7b03e9d0c7821b45c16bb629fc6f6bededc8a2a36ac"} Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.229754 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.330789 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-inventory\") pod \"fdfda940-145b-497f-8adc-d001a4f852ed\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.331032 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6bvb\" (UniqueName: \"kubernetes.io/projected/fdfda940-145b-497f-8adc-d001a4f852ed-kube-api-access-n6bvb\") pod \"fdfda940-145b-497f-8adc-d001a4f852ed\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.331377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-ssh-key\") pod \"fdfda940-145b-497f-8adc-d001a4f852ed\" (UID: \"fdfda940-145b-497f-8adc-d001a4f852ed\") " Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.338387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fdfda940-145b-497f-8adc-d001a4f852ed-kube-api-access-n6bvb" (OuterVolumeSpecName: "kube-api-access-n6bvb") pod "fdfda940-145b-497f-8adc-d001a4f852ed" (UID: "fdfda940-145b-497f-8adc-d001a4f852ed"). InnerVolumeSpecName "kube-api-access-n6bvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.371284 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fdfda940-145b-497f-8adc-d001a4f852ed" (UID: "fdfda940-145b-497f-8adc-d001a4f852ed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.372324 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-inventory" (OuterVolumeSpecName: "inventory") pod "fdfda940-145b-497f-8adc-d001a4f852ed" (UID: "fdfda940-145b-497f-8adc-d001a4f852ed"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.434292 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.434348 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fdfda940-145b-497f-8adc-d001a4f852ed-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.434364 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6bvb\" (UniqueName: \"kubernetes.io/projected/fdfda940-145b-497f-8adc-d001a4f852ed-kube-api-access-n6bvb\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.774954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" event={"ID":"fdfda940-145b-497f-8adc-d001a4f852ed","Type":"ContainerDied","Data":"b0480bd6d8424f875278dd3cf5ba19ddd6f797bb921a6940c349b5d21ffc9c36"} Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.775487 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0480bd6d8424f875278dd3cf5ba19ddd6f797bb921a6940c349b5d21ffc9c36" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.775582 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.897485 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg"] Dec 05 08:58:23 crc kubenswrapper[4795]: E1205 08:58:23.900834 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdfda940-145b-497f-8adc-d001a4f852ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.900877 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfda940-145b-497f-8adc-d001a4f852ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 08:58:23 crc kubenswrapper[4795]: E1205 08:58:23.900895 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" containerName="registry-server" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.900903 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" containerName="registry-server" Dec 05 08:58:23 crc kubenswrapper[4795]: E1205 08:58:23.900937 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" containerName="extract-utilities" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.900946 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" containerName="extract-utilities" Dec 05 08:58:23 crc kubenswrapper[4795]: E1205 08:58:23.900956 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" containerName="extract-content" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.900962 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" containerName="extract-content" Dec 05 08:58:23 crc 
kubenswrapper[4795]: E1205 08:58:23.900973 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" containerName="extract-content" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.900979 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" containerName="extract-content" Dec 05 08:58:23 crc kubenswrapper[4795]: E1205 08:58:23.900994 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" containerName="extract-utilities" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.901000 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" containerName="extract-utilities" Dec 05 08:58:23 crc kubenswrapper[4795]: E1205 08:58:23.901019 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" containerName="registry-server" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.901027 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" containerName="registry-server" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.901237 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="04277004-cab9-4a24-b2fc-7a7383138f18" containerName="registry-server" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.901254 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1f421c-0395-44a6-b922-02affb104c0f" containerName="registry-server" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.901267 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdfda940-145b-497f-8adc-d001a4f852ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.902101 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.908384 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.908419 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.909149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.917431 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.925093 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg"] Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.953635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.955045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:23 crc kubenswrapper[4795]: I1205 08:58:23.955427 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwmfj\" (UniqueName: \"kubernetes.io/projected/b98ff6b5-6e26-499a-a777-922aaa749b13-kube-api-access-rwmfj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:24 crc kubenswrapper[4795]: I1205 08:58:24.057810 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:24 crc kubenswrapper[4795]: I1205 08:58:24.058300 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:24 crc kubenswrapper[4795]: I1205 08:58:24.058533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmfj\" (UniqueName: \"kubernetes.io/projected/b98ff6b5-6e26-499a-a777-922aaa749b13-kube-api-access-rwmfj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:24 crc kubenswrapper[4795]: I1205 08:58:24.062429 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:24 crc kubenswrapper[4795]: I1205 08:58:24.062762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:24 crc kubenswrapper[4795]: I1205 08:58:24.085209 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwmfj\" (UniqueName: \"kubernetes.io/projected/b98ff6b5-6e26-499a-a777-922aaa749b13-kube-api-access-rwmfj\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:24 crc kubenswrapper[4795]: I1205 08:58:24.236179 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:24 crc kubenswrapper[4795]: I1205 08:58:24.888563 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg"] Dec 05 08:58:25 crc kubenswrapper[4795]: I1205 08:58:25.795024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" event={"ID":"b98ff6b5-6e26-499a-a777-922aaa749b13","Type":"ContainerStarted","Data":"9992afe58c4f41e22091ac9931257e6230a017bdc048d8d5290f5ada4ddf4fcf"} Dec 05 08:58:25 crc kubenswrapper[4795]: I1205 08:58:25.795811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" event={"ID":"b98ff6b5-6e26-499a-a777-922aaa749b13","Type":"ContainerStarted","Data":"deceef7705ccb9772261465a19b53fe103b7c2d663167fda0d9006431777125a"} Dec 05 08:58:25 crc kubenswrapper[4795]: I1205 08:58:25.825570 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" podStartSLOduration=2.6400843370000002 podStartE2EDuration="2.825545595s" podCreationTimestamp="2025-12-05 08:58:23 +0000 UTC" firstStartedPulling="2025-12-05 08:58:24.886183635 +0000 UTC m=+2056.458787364" lastFinishedPulling="2025-12-05 08:58:25.071644873 +0000 UTC m=+2056.644248622" observedRunningTime="2025-12-05 08:58:25.814132802 +0000 UTC m=+2057.386736541" watchObservedRunningTime="2025-12-05 08:58:25.825545595 +0000 UTC m=+2057.398149334" Dec 05 08:58:31 crc kubenswrapper[4795]: I1205 08:58:31.867892 4795 generic.go:334] "Generic (PLEG): container finished" podID="b98ff6b5-6e26-499a-a777-922aaa749b13" containerID="9992afe58c4f41e22091ac9931257e6230a017bdc048d8d5290f5ada4ddf4fcf" exitCode=0 Dec 05 08:58:31 crc kubenswrapper[4795]: I1205 08:58:31.868006 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" event={"ID":"b98ff6b5-6e26-499a-a777-922aaa749b13","Type":"ContainerDied","Data":"9992afe58c4f41e22091ac9931257e6230a017bdc048d8d5290f5ada4ddf4fcf"} Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.377794 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.532082 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-inventory\") pod \"b98ff6b5-6e26-499a-a777-922aaa749b13\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.532467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwmfj\" (UniqueName: \"kubernetes.io/projected/b98ff6b5-6e26-499a-a777-922aaa749b13-kube-api-access-rwmfj\") pod \"b98ff6b5-6e26-499a-a777-922aaa749b13\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.532587 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-ssh-key\") pod \"b98ff6b5-6e26-499a-a777-922aaa749b13\" (UID: \"b98ff6b5-6e26-499a-a777-922aaa749b13\") " Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.540846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98ff6b5-6e26-499a-a777-922aaa749b13-kube-api-access-rwmfj" (OuterVolumeSpecName: "kube-api-access-rwmfj") pod "b98ff6b5-6e26-499a-a777-922aaa749b13" (UID: "b98ff6b5-6e26-499a-a777-922aaa749b13"). InnerVolumeSpecName "kube-api-access-rwmfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.566688 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b98ff6b5-6e26-499a-a777-922aaa749b13" (UID: "b98ff6b5-6e26-499a-a777-922aaa749b13"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.568032 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-inventory" (OuterVolumeSpecName: "inventory") pod "b98ff6b5-6e26-499a-a777-922aaa749b13" (UID: "b98ff6b5-6e26-499a-a777-922aaa749b13"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.635039 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.635284 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwmfj\" (UniqueName: \"kubernetes.io/projected/b98ff6b5-6e26-499a-a777-922aaa749b13-kube-api-access-rwmfj\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.635397 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b98ff6b5-6e26-499a-a777-922aaa749b13-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.888750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" 
event={"ID":"b98ff6b5-6e26-499a-a777-922aaa749b13","Type":"ContainerDied","Data":"deceef7705ccb9772261465a19b53fe103b7c2d663167fda0d9006431777125a"} Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.889183 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deceef7705ccb9772261465a19b53fe103b7c2d663167fda0d9006431777125a" Dec 05 08:58:33 crc kubenswrapper[4795]: I1205 08:58:33.888856 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.053866 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn"] Dec 05 08:58:34 crc kubenswrapper[4795]: E1205 08:58:34.054732 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98ff6b5-6e26-499a-a777-922aaa749b13" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.054889 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98ff6b5-6e26-499a-a777-922aaa749b13" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.055148 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b98ff6b5-6e26-499a-a777-922aaa749b13" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.056019 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.059038 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.059256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.068092 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.068317 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.069887 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn"] Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.251487 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.252046 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.252181 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45zjh\" (UniqueName: \"kubernetes.io/projected/d2617c20-6235-43aa-85e0-6bed6d4649e3-kube-api-access-45zjh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.358099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.358322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.358760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45zjh\" (UniqueName: \"kubernetes.io/projected/d2617c20-6235-43aa-85e0-6bed6d4649e3-kube-api-access-45zjh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.364838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: 
\"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.374422 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.386016 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45zjh\" (UniqueName: \"kubernetes.io/projected/d2617c20-6235-43aa-85e0-6bed6d4649e3-kube-api-access-45zjh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tg6pn\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.389119 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:58:34 crc kubenswrapper[4795]: I1205 08:58:34.944137 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn"] Dec 05 08:58:35 crc kubenswrapper[4795]: I1205 08:58:35.912530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" event={"ID":"d2617c20-6235-43aa-85e0-6bed6d4649e3","Type":"ContainerStarted","Data":"1cf40e4c2c61e3b7b678c88ab6ef751d90da3794f89830406e9d01490a5f6ca5"} Dec 05 08:58:35 crc kubenswrapper[4795]: I1205 08:58:35.913584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" event={"ID":"d2617c20-6235-43aa-85e0-6bed6d4649e3","Type":"ContainerStarted","Data":"317f8f114194376266be7ddb2a6a171c902103ef889e4387a0612e86e698cc74"} Dec 05 08:58:35 crc kubenswrapper[4795]: I1205 08:58:35.940741 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" podStartSLOduration=1.791065138 podStartE2EDuration="1.940709644s" podCreationTimestamp="2025-12-05 08:58:34 +0000 UTC" firstStartedPulling="2025-12-05 08:58:34.956900302 +0000 UTC m=+2066.529504041" lastFinishedPulling="2025-12-05 08:58:35.106544808 +0000 UTC m=+2066.679148547" observedRunningTime="2025-12-05 08:58:35.937247222 +0000 UTC m=+2067.509850961" watchObservedRunningTime="2025-12-05 08:58:35.940709644 +0000 UTC m=+2067.513313413" Dec 05 08:58:52 crc kubenswrapper[4795]: I1205 08:58:52.590005 4795 scope.go:117] "RemoveContainer" containerID="3d09817b423c83115fc9d7d2accc81bcbc9a252ef25ea40944df638f9a422b69" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.766014 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9b98w"] Dec 05 08:59:03 crc kubenswrapper[4795]: 
I1205 08:59:03.769561 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.800448 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9b98w"] Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.889815 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-catalog-content\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.889894 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4gz9\" (UniqueName: \"kubernetes.io/projected/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-kube-api-access-c4gz9\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.889968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-utilities\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.994930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-catalog-content\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 
08:59:03.994988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4gz9\" (UniqueName: \"kubernetes.io/projected/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-kube-api-access-c4gz9\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.995030 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-utilities\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.995704 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-catalog-content\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:03 crc kubenswrapper[4795]: I1205 08:59:03.995832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-utilities\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:04 crc kubenswrapper[4795]: I1205 08:59:04.017564 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4gz9\" (UniqueName: \"kubernetes.io/projected/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-kube-api-access-c4gz9\") pod \"redhat-operators-9b98w\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:04 crc kubenswrapper[4795]: I1205 08:59:04.108318 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:04 crc kubenswrapper[4795]: I1205 08:59:04.666917 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9b98w"] Dec 05 08:59:05 crc kubenswrapper[4795]: I1205 08:59:05.312396 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerID="84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016" exitCode=0 Dec 05 08:59:05 crc kubenswrapper[4795]: I1205 08:59:05.313757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b98w" event={"ID":"5e71fb9b-7b01-435e-8bf3-f917b681a7cd","Type":"ContainerDied","Data":"84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016"} Dec 05 08:59:05 crc kubenswrapper[4795]: I1205 08:59:05.313911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b98w" event={"ID":"5e71fb9b-7b01-435e-8bf3-f917b681a7cd","Type":"ContainerStarted","Data":"7362ad8d3d0d2cbf09a5561044a3735584911b3008b0f0baf7c23e124c65da30"} Dec 05 08:59:06 crc kubenswrapper[4795]: I1205 08:59:06.334389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b98w" event={"ID":"5e71fb9b-7b01-435e-8bf3-f917b681a7cd","Type":"ContainerStarted","Data":"9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c"} Dec 05 08:59:10 crc kubenswrapper[4795]: I1205 08:59:10.415665 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerID="9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c" exitCode=0 Dec 05 08:59:10 crc kubenswrapper[4795]: I1205 08:59:10.416197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b98w" 
event={"ID":"5e71fb9b-7b01-435e-8bf3-f917b681a7cd","Type":"ContainerDied","Data":"9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c"} Dec 05 08:59:11 crc kubenswrapper[4795]: I1205 08:59:11.428728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b98w" event={"ID":"5e71fb9b-7b01-435e-8bf3-f917b681a7cd","Type":"ContainerStarted","Data":"cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e"} Dec 05 08:59:11 crc kubenswrapper[4795]: I1205 08:59:11.460133 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9b98w" podStartSLOduration=2.967046226 podStartE2EDuration="8.460108406s" podCreationTimestamp="2025-12-05 08:59:03 +0000 UTC" firstStartedPulling="2025-12-05 08:59:05.314777285 +0000 UTC m=+2096.887381024" lastFinishedPulling="2025-12-05 08:59:10.807839475 +0000 UTC m=+2102.380443204" observedRunningTime="2025-12-05 08:59:11.456241454 +0000 UTC m=+2103.028845213" watchObservedRunningTime="2025-12-05 08:59:11.460108406 +0000 UTC m=+2103.032712145" Dec 05 08:59:14 crc kubenswrapper[4795]: I1205 08:59:14.109423 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:14 crc kubenswrapper[4795]: I1205 08:59:14.109981 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:15 crc kubenswrapper[4795]: I1205 08:59:15.171294 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9b98w" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="registry-server" probeResult="failure" output=< Dec 05 08:59:15 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 08:59:15 crc kubenswrapper[4795]: > Dec 05 08:59:22 crc kubenswrapper[4795]: I1205 08:59:22.005652 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="d2617c20-6235-43aa-85e0-6bed6d4649e3" containerID="1cf40e4c2c61e3b7b678c88ab6ef751d90da3794f89830406e9d01490a5f6ca5" exitCode=0 Dec 05 08:59:22 crc kubenswrapper[4795]: I1205 08:59:22.005737 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" event={"ID":"d2617c20-6235-43aa-85e0-6bed6d4649e3","Type":"ContainerDied","Data":"1cf40e4c2c61e3b7b678c88ab6ef751d90da3794f89830406e9d01490a5f6ca5"} Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.467271 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.614841 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-ssh-key\") pod \"d2617c20-6235-43aa-85e0-6bed6d4649e3\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.615339 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-inventory\") pod \"d2617c20-6235-43aa-85e0-6bed6d4649e3\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.615416 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45zjh\" (UniqueName: \"kubernetes.io/projected/d2617c20-6235-43aa-85e0-6bed6d4649e3-kube-api-access-45zjh\") pod \"d2617c20-6235-43aa-85e0-6bed6d4649e3\" (UID: \"d2617c20-6235-43aa-85e0-6bed6d4649e3\") " Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.630833 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2617c20-6235-43aa-85e0-6bed6d4649e3-kube-api-access-45zjh" (OuterVolumeSpecName: 
"kube-api-access-45zjh") pod "d2617c20-6235-43aa-85e0-6bed6d4649e3" (UID: "d2617c20-6235-43aa-85e0-6bed6d4649e3"). InnerVolumeSpecName "kube-api-access-45zjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.655907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-inventory" (OuterVolumeSpecName: "inventory") pod "d2617c20-6235-43aa-85e0-6bed6d4649e3" (UID: "d2617c20-6235-43aa-85e0-6bed6d4649e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.679812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d2617c20-6235-43aa-85e0-6bed6d4649e3" (UID: "d2617c20-6235-43aa-85e0-6bed6d4649e3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.718363 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45zjh\" (UniqueName: \"kubernetes.io/projected/d2617c20-6235-43aa-85e0-6bed6d4649e3-kube-api-access-45zjh\") on node \"crc\" DevicePath \"\"" Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.718398 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 08:59:23 crc kubenswrapper[4795]: I1205 08:59:23.718408 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2617c20-6235-43aa-85e0-6bed6d4649e3-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.027479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" event={"ID":"d2617c20-6235-43aa-85e0-6bed6d4649e3","Type":"ContainerDied","Data":"317f8f114194376266be7ddb2a6a171c902103ef889e4387a0612e86e698cc74"} Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.027534 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="317f8f114194376266be7ddb2a6a171c902103ef889e4387a0612e86e698cc74" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.027553 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tg6pn" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.128685 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz"] Dec 05 08:59:24 crc kubenswrapper[4795]: E1205 08:59:24.129528 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2617c20-6235-43aa-85e0-6bed6d4649e3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.129648 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2617c20-6235-43aa-85e0-6bed6d4649e3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.129946 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2617c20-6235-43aa-85e0-6bed6d4649e3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.131198 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.134013 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.134895 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.135111 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.135278 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.154017 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz"] Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.185064 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.230386 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.230467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: 
\"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.230853 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsshh\" (UniqueName: \"kubernetes.io/projected/2733fd67-3848-4b52-8246-0aa3a4f60d10-kube-api-access-rsshh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.253357 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.333137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsshh\" (UniqueName: \"kubernetes.io/projected/2733fd67-3848-4b52-8246-0aa3a4f60d10-kube-api-access-rsshh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.333563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.333736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: 
\"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.338139 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.338792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.350198 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsshh\" (UniqueName: \"kubernetes.io/projected/2733fd67-3848-4b52-8246-0aa3a4f60d10-kube-api-access-rsshh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.424258 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9b98w"] Dec 05 08:59:24 crc kubenswrapper[4795]: I1205 08:59:24.450682 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 08:59:25 crc kubenswrapper[4795]: I1205 08:59:25.077639 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz"] Dec 05 08:59:25 crc kubenswrapper[4795]: W1205 08:59:25.081232 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2733fd67_3848_4b52_8246_0aa3a4f60d10.slice/crio-340e62961eea248cd5222ab62e73c7af7c4427b56c547c6f7cb57d66701164c1 WatchSource:0}: Error finding container 340e62961eea248cd5222ab62e73c7af7c4427b56c547c6f7cb57d66701164c1: Status 404 returned error can't find the container with id 340e62961eea248cd5222ab62e73c7af7c4427b56c547c6f7cb57d66701164c1 Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.060762 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" event={"ID":"2733fd67-3848-4b52-8246-0aa3a4f60d10","Type":"ContainerStarted","Data":"3766af2eb25a569c9db0932d0efc3e7c11dab1303454f9f24c6ab3896c6da26d"} Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.061240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" event={"ID":"2733fd67-3848-4b52-8246-0aa3a4f60d10","Type":"ContainerStarted","Data":"340e62961eea248cd5222ab62e73c7af7c4427b56c547c6f7cb57d66701164c1"} Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.060897 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9b98w" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="registry-server" containerID="cri-o://cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e" gracePeriod=2 Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.091304 4795 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" podStartSLOduration=1.903669557 podStartE2EDuration="2.091282663s" podCreationTimestamp="2025-12-05 08:59:24 +0000 UTC" firstStartedPulling="2025-12-05 08:59:25.08336269 +0000 UTC m=+2116.655966429" lastFinishedPulling="2025-12-05 08:59:25.270975796 +0000 UTC m=+2116.843579535" observedRunningTime="2025-12-05 08:59:26.085008826 +0000 UTC m=+2117.657612565" watchObservedRunningTime="2025-12-05 08:59:26.091282663 +0000 UTC m=+2117.663886402" Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.597898 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.688764 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-utilities\") pod \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.688911 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4gz9\" (UniqueName: \"kubernetes.io/projected/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-kube-api-access-c4gz9\") pod \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.688968 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-catalog-content\") pod \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\" (UID: \"5e71fb9b-7b01-435e-8bf3-f917b681a7cd\") " Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.699451 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-utilities" (OuterVolumeSpecName: "utilities") pod "5e71fb9b-7b01-435e-8bf3-f917b681a7cd" (UID: "5e71fb9b-7b01-435e-8bf3-f917b681a7cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.710149 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-kube-api-access-c4gz9" (OuterVolumeSpecName: "kube-api-access-c4gz9") pod "5e71fb9b-7b01-435e-8bf3-f917b681a7cd" (UID: "5e71fb9b-7b01-435e-8bf3-f917b681a7cd"). InnerVolumeSpecName "kube-api-access-c4gz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.791758 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4gz9\" (UniqueName: \"kubernetes.io/projected/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-kube-api-access-c4gz9\") on node \"crc\" DevicePath \"\"" Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.791817 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.813979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e71fb9b-7b01-435e-8bf3-f917b681a7cd" (UID: "5e71fb9b-7b01-435e-8bf3-f917b681a7cd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 08:59:26 crc kubenswrapper[4795]: I1205 08:59:26.894146 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e71fb9b-7b01-435e-8bf3-f917b681a7cd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.075674 4795 generic.go:334] "Generic (PLEG): container finished" podID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerID="cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e" exitCode=0 Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.075821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b98w" event={"ID":"5e71fb9b-7b01-435e-8bf3-f917b681a7cd","Type":"ContainerDied","Data":"cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e"} Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.075889 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9b98w" event={"ID":"5e71fb9b-7b01-435e-8bf3-f917b681a7cd","Type":"ContainerDied","Data":"7362ad8d3d0d2cbf09a5561044a3735584911b3008b0f0baf7c23e124c65da30"} Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.075921 4795 scope.go:117] "RemoveContainer" containerID="cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.075812 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9b98w" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.112104 4795 scope.go:117] "RemoveContainer" containerID="9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.120069 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9b98w"] Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.130651 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9b98w"] Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.154425 4795 scope.go:117] "RemoveContainer" containerID="84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.199799 4795 scope.go:117] "RemoveContainer" containerID="cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e" Dec 05 08:59:27 crc kubenswrapper[4795]: E1205 08:59:27.200747 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e\": container with ID starting with cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e not found: ID does not exist" containerID="cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.200819 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e"} err="failed to get container status \"cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e\": rpc error: code = NotFound desc = could not find container \"cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e\": container with ID starting with cbcff1517d76b7a1f1489223b3dc01370bbfb3b3369e56865a10a33d0b9b371e not found: ID does 
not exist" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.200857 4795 scope.go:117] "RemoveContainer" containerID="9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c" Dec 05 08:59:27 crc kubenswrapper[4795]: E1205 08:59:27.201218 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c\": container with ID starting with 9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c not found: ID does not exist" containerID="9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.201256 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c"} err="failed to get container status \"9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c\": rpc error: code = NotFound desc = could not find container \"9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c\": container with ID starting with 9e15c9f1d80d8b95c9f1183ce24fc8b1177e6a9bf1177a6b65d58aaf26f4835c not found: ID does not exist" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.201282 4795 scope.go:117] "RemoveContainer" containerID="84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016" Dec 05 08:59:27 crc kubenswrapper[4795]: E1205 08:59:27.201583 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016\": container with ID starting with 84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016 not found: ID does not exist" containerID="84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016" Dec 05 08:59:27 crc kubenswrapper[4795]: I1205 08:59:27.201638 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016"} err="failed to get container status \"84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016\": rpc error: code = NotFound desc = could not find container \"84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016\": container with ID starting with 84d5574c0a80eb023142b7469a418e66a9e70d09919a40a945786daa21b14016 not found: ID does not exist" Dec 05 08:59:28 crc kubenswrapper[4795]: I1205 08:59:28.763576 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" path="/var/lib/kubelet/pods/5e71fb9b-7b01-435e-8bf3-f917b681a7cd/volumes" Dec 05 08:59:40 crc kubenswrapper[4795]: I1205 08:59:40.826966 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 08:59:40 crc kubenswrapper[4795]: I1205 08:59:40.827709 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.186491 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5"] Dec 05 09:00:00 crc kubenswrapper[4795]: E1205 09:00:00.187731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="extract-utilities" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.187749 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="extract-utilities" Dec 05 09:00:00 crc kubenswrapper[4795]: E1205 09:00:00.187777 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="registry-server" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.187786 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="registry-server" Dec 05 09:00:00 crc kubenswrapper[4795]: E1205 09:00:00.187813 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="extract-content" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.187823 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="extract-content" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.188074 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e71fb9b-7b01-435e-8bf3-f917b681a7cd" containerName="registry-server" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.189034 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.192146 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.192295 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.211656 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5"] Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.314025 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zlc7\" (UniqueName: \"kubernetes.io/projected/b81a91e2-dcb4-4743-8fa5-836b060e27f1-kube-api-access-6zlc7\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.314167 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b81a91e2-dcb4-4743-8fa5-836b060e27f1-config-volume\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.314206 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b81a91e2-dcb4-4743-8fa5-836b060e27f1-secret-volume\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.415564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b81a91e2-dcb4-4743-8fa5-836b060e27f1-config-volume\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.415939 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b81a91e2-dcb4-4743-8fa5-836b060e27f1-secret-volume\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.416118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zlc7\" (UniqueName: \"kubernetes.io/projected/b81a91e2-dcb4-4743-8fa5-836b060e27f1-kube-api-access-6zlc7\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.417431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b81a91e2-dcb4-4743-8fa5-836b060e27f1-config-volume\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.431404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b81a91e2-dcb4-4743-8fa5-836b060e27f1-secret-volume\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.441921 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zlc7\" (UniqueName: \"kubernetes.io/projected/b81a91e2-dcb4-4743-8fa5-836b060e27f1-kube-api-access-6zlc7\") pod \"collect-profiles-29415420-h87z5\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:00 crc kubenswrapper[4795]: I1205 09:00:00.524573 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:01 crc kubenswrapper[4795]: I1205 09:00:01.250657 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5"] Dec 05 09:00:01 crc kubenswrapper[4795]: I1205 09:00:01.510484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" event={"ID":"b81a91e2-dcb4-4743-8fa5-836b060e27f1","Type":"ContainerStarted","Data":"3042ebd2620f8e1e4debe629d6bd4b7aacbfd04de32e88efa080ad2eb811ccc8"} Dec 05 09:00:01 crc kubenswrapper[4795]: I1205 09:00:01.510536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" event={"ID":"b81a91e2-dcb4-4743-8fa5-836b060e27f1","Type":"ContainerStarted","Data":"499b2cc3fd437abc69eba6fc67757185c979c9be0f93beb1f97a5cb872c8a72e"} Dec 05 09:00:01 crc kubenswrapper[4795]: I1205 09:00:01.533480 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" 
podStartSLOduration=1.5334540840000002 podStartE2EDuration="1.533454084s" podCreationTimestamp="2025-12-05 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 09:00:01.53253003 +0000 UTC m=+2153.105133769" watchObservedRunningTime="2025-12-05 09:00:01.533454084 +0000 UTC m=+2153.106057823" Dec 05 09:00:02 crc kubenswrapper[4795]: I1205 09:00:02.527278 4795 generic.go:334] "Generic (PLEG): container finished" podID="b81a91e2-dcb4-4743-8fa5-836b060e27f1" containerID="3042ebd2620f8e1e4debe629d6bd4b7aacbfd04de32e88efa080ad2eb811ccc8" exitCode=0 Dec 05 09:00:02 crc kubenswrapper[4795]: I1205 09:00:02.527426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" event={"ID":"b81a91e2-dcb4-4743-8fa5-836b060e27f1","Type":"ContainerDied","Data":"3042ebd2620f8e1e4debe629d6bd4b7aacbfd04de32e88efa080ad2eb811ccc8"} Dec 05 09:00:03 crc kubenswrapper[4795]: I1205 09:00:03.903130 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:03 crc kubenswrapper[4795]: I1205 09:00:03.996129 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zlc7\" (UniqueName: \"kubernetes.io/projected/b81a91e2-dcb4-4743-8fa5-836b060e27f1-kube-api-access-6zlc7\") pod \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " Dec 05 09:00:03 crc kubenswrapper[4795]: I1205 09:00:03.996175 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b81a91e2-dcb4-4743-8fa5-836b060e27f1-secret-volume\") pod \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " Dec 05 09:00:03 crc kubenswrapper[4795]: I1205 09:00:03.996206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b81a91e2-dcb4-4743-8fa5-836b060e27f1-config-volume\") pod \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\" (UID: \"b81a91e2-dcb4-4743-8fa5-836b060e27f1\") " Dec 05 09:00:03 crc kubenswrapper[4795]: I1205 09:00:03.997599 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b81a91e2-dcb4-4743-8fa5-836b060e27f1-config-volume" (OuterVolumeSpecName: "config-volume") pod "b81a91e2-dcb4-4743-8fa5-836b060e27f1" (UID: "b81a91e2-dcb4-4743-8fa5-836b060e27f1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.012481 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b81a91e2-dcb4-4743-8fa5-836b060e27f1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b81a91e2-dcb4-4743-8fa5-836b060e27f1" (UID: "b81a91e2-dcb4-4743-8fa5-836b060e27f1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.012934 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b81a91e2-dcb4-4743-8fa5-836b060e27f1-kube-api-access-6zlc7" (OuterVolumeSpecName: "kube-api-access-6zlc7") pod "b81a91e2-dcb4-4743-8fa5-836b060e27f1" (UID: "b81a91e2-dcb4-4743-8fa5-836b060e27f1"). InnerVolumeSpecName "kube-api-access-6zlc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.099538 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zlc7\" (UniqueName: \"kubernetes.io/projected/b81a91e2-dcb4-4743-8fa5-836b060e27f1-kube-api-access-6zlc7\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.099590 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b81a91e2-dcb4-4743-8fa5-836b060e27f1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.099602 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b81a91e2-dcb4-4743-8fa5-836b060e27f1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.401694 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t"] Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.409161 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415375-fcl8t"] Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.548737 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" 
event={"ID":"b81a91e2-dcb4-4743-8fa5-836b060e27f1","Type":"ContainerDied","Data":"499b2cc3fd437abc69eba6fc67757185c979c9be0f93beb1f97a5cb872c8a72e"} Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.548784 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5" Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.548796 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="499b2cc3fd437abc69eba6fc67757185c979c9be0f93beb1f97a5cb872c8a72e" Dec 05 09:00:04 crc kubenswrapper[4795]: I1205 09:00:04.760695 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da05ac8-31e6-4fb6-b8d4-b10d5cc26821" path="/var/lib/kubelet/pods/9da05ac8-31e6-4fb6-b8d4-b10d5cc26821/volumes" Dec 05 09:00:10 crc kubenswrapper[4795]: I1205 09:00:10.826982 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:00:10 crc kubenswrapper[4795]: I1205 09:00:10.827546 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:00:23 crc kubenswrapper[4795]: I1205 09:00:23.748150 4795 generic.go:334] "Generic (PLEG): container finished" podID="2733fd67-3848-4b52-8246-0aa3a4f60d10" containerID="3766af2eb25a569c9db0932d0efc3e7c11dab1303454f9f24c6ab3896c6da26d" exitCode=0 Dec 05 09:00:23 crc kubenswrapper[4795]: I1205 09:00:23.748261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" event={"ID":"2733fd67-3848-4b52-8246-0aa3a4f60d10","Type":"ContainerDied","Data":"3766af2eb25a569c9db0932d0efc3e7c11dab1303454f9f24c6ab3896c6da26d"} Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.203001 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.303288 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsshh\" (UniqueName: \"kubernetes.io/projected/2733fd67-3848-4b52-8246-0aa3a4f60d10-kube-api-access-rsshh\") pod \"2733fd67-3848-4b52-8246-0aa3a4f60d10\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.303512 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-inventory\") pod \"2733fd67-3848-4b52-8246-0aa3a4f60d10\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.303707 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-ssh-key\") pod \"2733fd67-3848-4b52-8246-0aa3a4f60d10\" (UID: \"2733fd67-3848-4b52-8246-0aa3a4f60d10\") " Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.312376 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2733fd67-3848-4b52-8246-0aa3a4f60d10-kube-api-access-rsshh" (OuterVolumeSpecName: "kube-api-access-rsshh") pod "2733fd67-3848-4b52-8246-0aa3a4f60d10" (UID: "2733fd67-3848-4b52-8246-0aa3a4f60d10"). InnerVolumeSpecName "kube-api-access-rsshh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.340931 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-inventory" (OuterVolumeSpecName: "inventory") pod "2733fd67-3848-4b52-8246-0aa3a4f60d10" (UID: "2733fd67-3848-4b52-8246-0aa3a4f60d10"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.363992 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2733fd67-3848-4b52-8246-0aa3a4f60d10" (UID: "2733fd67-3848-4b52-8246-0aa3a4f60d10"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.406952 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.406998 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsshh\" (UniqueName: \"kubernetes.io/projected/2733fd67-3848-4b52-8246-0aa3a4f60d10-kube-api-access-rsshh\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.407015 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2733fd67-3848-4b52-8246-0aa3a4f60d10-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.780306 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" 
event={"ID":"2733fd67-3848-4b52-8246-0aa3a4f60d10","Type":"ContainerDied","Data":"340e62961eea248cd5222ab62e73c7af7c4427b56c547c6f7cb57d66701164c1"} Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.780377 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="340e62961eea248cd5222ab62e73c7af7c4427b56c547c6f7cb57d66701164c1" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.780527 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.906924 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ktqqs"] Dec 05 09:00:25 crc kubenswrapper[4795]: E1205 09:00:25.907469 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2733fd67-3848-4b52-8246-0aa3a4f60d10" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.907490 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2733fd67-3848-4b52-8246-0aa3a4f60d10" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:00:25 crc kubenswrapper[4795]: E1205 09:00:25.907541 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b81a91e2-dcb4-4743-8fa5-836b060e27f1" containerName="collect-profiles" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.907549 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b81a91e2-dcb4-4743-8fa5-836b060e27f1" containerName="collect-profiles" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.907749 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2733fd67-3848-4b52-8246-0aa3a4f60d10" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.907770 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b81a91e2-dcb4-4743-8fa5-836b060e27f1" containerName="collect-profiles" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.908721 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.922114 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ktqqs"] Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.922273 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.922572 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.922587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:00:25 crc kubenswrapper[4795]: I1205 09:00:25.922839 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.026294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.026349 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.026466 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm25n\" (UniqueName: \"kubernetes.io/projected/257beaf8-8804-48c7-ac78-c12ace238dd2-kube-api-access-bm25n\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.128581 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.128701 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.128898 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm25n\" (UniqueName: \"kubernetes.io/projected/257beaf8-8804-48c7-ac78-c12ace238dd2-kube-api-access-bm25n\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.134462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.135856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.154626 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm25n\" (UniqueName: \"kubernetes.io/projected/257beaf8-8804-48c7-ac78-c12ace238dd2-kube-api-access-bm25n\") pod \"ssh-known-hosts-edpm-deployment-ktqqs\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.281482 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:26 crc kubenswrapper[4795]: I1205 09:00:26.851599 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ktqqs"] Dec 05 09:00:27 crc kubenswrapper[4795]: I1205 09:00:27.798693 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" event={"ID":"257beaf8-8804-48c7-ac78-c12ace238dd2","Type":"ContainerStarted","Data":"3a41db017114aa5ebef0b5174d9786ec400734ee641dc9aa9c233cfa86a6c30f"} Dec 05 09:00:27 crc kubenswrapper[4795]: I1205 09:00:27.799041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" event={"ID":"257beaf8-8804-48c7-ac78-c12ace238dd2","Type":"ContainerStarted","Data":"5a883f1842553b737171e2ad244f4da711f41f5484cad138792538ba04a14d6c"} Dec 05 09:00:27 crc kubenswrapper[4795]: I1205 09:00:27.825334 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" podStartSLOduration=2.6643602680000003 podStartE2EDuration="2.825306535s" podCreationTimestamp="2025-12-05 09:00:25 +0000 UTC" firstStartedPulling="2025-12-05 09:00:26.864812913 +0000 UTC m=+2178.437416652" lastFinishedPulling="2025-12-05 09:00:27.02575916 +0000 UTC m=+2178.598362919" observedRunningTime="2025-12-05 09:00:27.820747304 +0000 UTC m=+2179.393351043" watchObservedRunningTime="2025-12-05 09:00:27.825306535 +0000 UTC m=+2179.397910274" Dec 05 09:00:35 crc kubenswrapper[4795]: I1205 09:00:35.878822 4795 generic.go:334] "Generic (PLEG): container finished" podID="257beaf8-8804-48c7-ac78-c12ace238dd2" containerID="3a41db017114aa5ebef0b5174d9786ec400734ee641dc9aa9c233cfa86a6c30f" exitCode=0 Dec 05 09:00:35 crc kubenswrapper[4795]: I1205 09:00:35.878883 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" 
event={"ID":"257beaf8-8804-48c7-ac78-c12ace238dd2","Type":"ContainerDied","Data":"3a41db017114aa5ebef0b5174d9786ec400734ee641dc9aa9c233cfa86a6c30f"} Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.345803 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.507865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-inventory-0\") pod \"257beaf8-8804-48c7-ac78-c12ace238dd2\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.508035 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-ssh-key-openstack-edpm-ipam\") pod \"257beaf8-8804-48c7-ac78-c12ace238dd2\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.508353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm25n\" (UniqueName: \"kubernetes.io/projected/257beaf8-8804-48c7-ac78-c12ace238dd2-kube-api-access-bm25n\") pod \"257beaf8-8804-48c7-ac78-c12ace238dd2\" (UID: \"257beaf8-8804-48c7-ac78-c12ace238dd2\") " Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.518157 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257beaf8-8804-48c7-ac78-c12ace238dd2-kube-api-access-bm25n" (OuterVolumeSpecName: "kube-api-access-bm25n") pod "257beaf8-8804-48c7-ac78-c12ace238dd2" (UID: "257beaf8-8804-48c7-ac78-c12ace238dd2"). InnerVolumeSpecName "kube-api-access-bm25n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.552854 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "257beaf8-8804-48c7-ac78-c12ace238dd2" (UID: "257beaf8-8804-48c7-ac78-c12ace238dd2"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.554216 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "257beaf8-8804-48c7-ac78-c12ace238dd2" (UID: "257beaf8-8804-48c7-ac78-c12ace238dd2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.614863 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm25n\" (UniqueName: \"kubernetes.io/projected/257beaf8-8804-48c7-ac78-c12ace238dd2-kube-api-access-bm25n\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.615415 4795 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.615434 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/257beaf8-8804-48c7-ac78-c12ace238dd2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.902606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" 
event={"ID":"257beaf8-8804-48c7-ac78-c12ace238dd2","Type":"ContainerDied","Data":"5a883f1842553b737171e2ad244f4da711f41f5484cad138792538ba04a14d6c"} Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.902794 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a883f1842553b737171e2ad244f4da711f41f5484cad138792538ba04a14d6c" Dec 05 09:00:37 crc kubenswrapper[4795]: I1205 09:00:37.902833 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ktqqs" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.017866 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl"] Dec 05 09:00:38 crc kubenswrapper[4795]: E1205 09:00:38.018461 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257beaf8-8804-48c7-ac78-c12ace238dd2" containerName="ssh-known-hosts-edpm-deployment" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.018488 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="257beaf8-8804-48c7-ac78-c12ace238dd2" containerName="ssh-known-hosts-edpm-deployment" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.018778 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="257beaf8-8804-48c7-ac78-c12ace238dd2" containerName="ssh-known-hosts-edpm-deployment" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.019656 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.029207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.029320 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.029830 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.030130 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.038751 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl"] Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.126423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.126490 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.126561 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9nvs\" (UniqueName: \"kubernetes.io/projected/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-kube-api-access-g9nvs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.228379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.228431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.228507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9nvs\" (UniqueName: \"kubernetes.io/projected/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-kube-api-access-g9nvs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.235885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.240920 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.252774 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9nvs\" (UniqueName: \"kubernetes.io/projected/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-kube-api-access-g9nvs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7xjhl\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:38 crc kubenswrapper[4795]: I1205 09:00:38.367275 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:39 crc kubenswrapper[4795]: I1205 09:00:39.046139 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl"] Dec 05 09:00:39 crc kubenswrapper[4795]: I1205 09:00:39.930161 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" event={"ID":"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a","Type":"ContainerStarted","Data":"4fa19fc8799fafd68e82f735f64cfc9d058d4e4b977317b8385d507c03420d39"} Dec 05 09:00:39 crc kubenswrapper[4795]: I1205 09:00:39.930580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" event={"ID":"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a","Type":"ContainerStarted","Data":"2f086cca82a46bc22f3d322ab2c4e3f1b85595d61cf7a9c5bc86271801ac74a4"} Dec 05 09:00:39 crc kubenswrapper[4795]: I1205 09:00:39.960306 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" podStartSLOduration=2.770828918 podStartE2EDuration="2.960268092s" podCreationTimestamp="2025-12-05 09:00:37 +0000 UTC" firstStartedPulling="2025-12-05 09:00:39.061949922 +0000 UTC m=+2190.634553661" lastFinishedPulling="2025-12-05 09:00:39.251389096 +0000 UTC m=+2190.823992835" observedRunningTime="2025-12-05 09:00:39.948727435 +0000 UTC m=+2191.521331164" watchObservedRunningTime="2025-12-05 09:00:39.960268092 +0000 UTC m=+2191.532871831" Dec 05 09:00:40 crc kubenswrapper[4795]: I1205 09:00:40.827244 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:00:40 crc kubenswrapper[4795]: I1205 
09:00:40.827731 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:00:40 crc kubenswrapper[4795]: I1205 09:00:40.827828 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:00:40 crc kubenswrapper[4795]: I1205 09:00:40.829082 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bb8db58c6ff5eff08a8e0fa9aff2e776f8b94d253c5d66fa55553b15803d255"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:00:40 crc kubenswrapper[4795]: I1205 09:00:40.829179 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://3bb8db58c6ff5eff08a8e0fa9aff2e776f8b94d253c5d66fa55553b15803d255" gracePeriod=600 Dec 05 09:00:40 crc kubenswrapper[4795]: E1205 09:00:40.888771 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23494e8d_0824_46a2_9b0c_c447f1d5e5d0.slice/crio-conmon-3bb8db58c6ff5eff08a8e0fa9aff2e776f8b94d253c5d66fa55553b15803d255.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23494e8d_0824_46a2_9b0c_c447f1d5e5d0.slice/crio-3bb8db58c6ff5eff08a8e0fa9aff2e776f8b94d253c5d66fa55553b15803d255.scope\": 
RecentStats: unable to find data in memory cache]" Dec 05 09:00:41 crc kubenswrapper[4795]: I1205 09:00:41.952589 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="3bb8db58c6ff5eff08a8e0fa9aff2e776f8b94d253c5d66fa55553b15803d255" exitCode=0 Dec 05 09:00:41 crc kubenswrapper[4795]: I1205 09:00:41.952642 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"3bb8db58c6ff5eff08a8e0fa9aff2e776f8b94d253c5d66fa55553b15803d255"} Dec 05 09:00:41 crc kubenswrapper[4795]: I1205 09:00:41.953634 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5"} Dec 05 09:00:41 crc kubenswrapper[4795]: I1205 09:00:41.953665 4795 scope.go:117] "RemoveContainer" containerID="566eeda8bbeca02b218637eb2242b9ed8d131b55258f185455e4a24cce385fca" Dec 05 09:00:45 crc kubenswrapper[4795]: I1205 09:00:45.963355 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwgqk"] Dec 05 09:00:45 crc kubenswrapper[4795]: I1205 09:00:45.967484 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:45 crc kubenswrapper[4795]: I1205 09:00:45.993857 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwgqk"] Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.041599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xn9q\" (UniqueName: \"kubernetes.io/projected/096e4f04-c680-4a72-ae03-5aef60d8e329-kube-api-access-9xn9q\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.041728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-catalog-content\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.041849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-utilities\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.143455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-utilities\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.143586 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9xn9q\" (UniqueName: \"kubernetes.io/projected/096e4f04-c680-4a72-ae03-5aef60d8e329-kube-api-access-9xn9q\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.143674 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-catalog-content\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.144363 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-utilities\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.144376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-catalog-content\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.190056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xn9q\" (UniqueName: \"kubernetes.io/projected/096e4f04-c680-4a72-ae03-5aef60d8e329-kube-api-access-9xn9q\") pod \"certified-operators-mwgqk\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.291776 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:46 crc kubenswrapper[4795]: I1205 09:00:46.867084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwgqk"] Dec 05 09:00:47 crc kubenswrapper[4795]: I1205 09:00:47.011896 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgqk" event={"ID":"096e4f04-c680-4a72-ae03-5aef60d8e329","Type":"ContainerStarted","Data":"8f9302eee4dbbe006932c047701f8c0d8ce235b645121287a03bfd4ebd502e89"} Dec 05 09:00:48 crc kubenswrapper[4795]: I1205 09:00:48.024882 4795 generic.go:334] "Generic (PLEG): container finished" podID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerID="314075f41e944df5fe962d78d9b31f9f0f35dbba8e520612395965798da76115" exitCode=0 Dec 05 09:00:48 crc kubenswrapper[4795]: I1205 09:00:48.024996 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgqk" event={"ID":"096e4f04-c680-4a72-ae03-5aef60d8e329","Type":"ContainerDied","Data":"314075f41e944df5fe962d78d9b31f9f0f35dbba8e520612395965798da76115"} Dec 05 09:00:49 crc kubenswrapper[4795]: I1205 09:00:49.039575 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgqk" event={"ID":"096e4f04-c680-4a72-ae03-5aef60d8e329","Type":"ContainerStarted","Data":"f01f6023497a99bc17ae1d9b7a7ca4ca57c82f33ca06c676da61f144449cf380"} Dec 05 09:00:49 crc kubenswrapper[4795]: I1205 09:00:49.042389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" event={"ID":"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a","Type":"ContainerDied","Data":"4fa19fc8799fafd68e82f735f64cfc9d058d4e4b977317b8385d507c03420d39"} Dec 05 09:00:49 crc kubenswrapper[4795]: I1205 09:00:49.041817 4795 generic.go:334] "Generic (PLEG): container finished" podID="939bdde4-5c5c-4d05-bf99-f5ae1fe7216a" 
containerID="4fa19fc8799fafd68e82f735f64cfc9d058d4e4b977317b8385d507c03420d39" exitCode=0 Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.052945 4795 generic.go:334] "Generic (PLEG): container finished" podID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerID="f01f6023497a99bc17ae1d9b7a7ca4ca57c82f33ca06c676da61f144449cf380" exitCode=0 Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.053102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgqk" event={"ID":"096e4f04-c680-4a72-ae03-5aef60d8e329","Type":"ContainerDied","Data":"f01f6023497a99bc17ae1d9b7a7ca4ca57c82f33ca06c676da61f144449cf380"} Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.653579 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.765468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-ssh-key\") pod \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.765990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9nvs\" (UniqueName: \"kubernetes.io/projected/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-kube-api-access-g9nvs\") pod \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.766128 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-inventory\") pod \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\" (UID: \"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a\") " Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.775922 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-kube-api-access-g9nvs" (OuterVolumeSpecName: "kube-api-access-g9nvs") pod "939bdde4-5c5c-4d05-bf99-f5ae1fe7216a" (UID: "939bdde4-5c5c-4d05-bf99-f5ae1fe7216a"). InnerVolumeSpecName "kube-api-access-g9nvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.799411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-inventory" (OuterVolumeSpecName: "inventory") pod "939bdde4-5c5c-4d05-bf99-f5ae1fe7216a" (UID: "939bdde4-5c5c-4d05-bf99-f5ae1fe7216a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.818161 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "939bdde4-5c5c-4d05-bf99-f5ae1fe7216a" (UID: "939bdde4-5c5c-4d05-bf99-f5ae1fe7216a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.869271 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.869980 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9nvs\" (UniqueName: \"kubernetes.io/projected/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-kube-api-access-g9nvs\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:50 crc kubenswrapper[4795]: I1205 09:00:50.870068 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/939bdde4-5c5c-4d05-bf99-f5ae1fe7216a-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.065988 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" event={"ID":"939bdde4-5c5c-4d05-bf99-f5ae1fe7216a","Type":"ContainerDied","Data":"2f086cca82a46bc22f3d322ab2c4e3f1b85595d61cf7a9c5bc86271801ac74a4"} Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.066046 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f086cca82a46bc22f3d322ab2c4e3f1b85595d61cf7a9c5bc86271801ac74a4" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.066003 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7xjhl" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.068395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgqk" event={"ID":"096e4f04-c680-4a72-ae03-5aef60d8e329","Type":"ContainerStarted","Data":"e8365d5c7c2e951d822c3c6f362f81a602d970a0b707dbd25aaaa36b84b241db"} Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.108397 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwgqk" podStartSLOduration=3.648829449 podStartE2EDuration="6.108363668s" podCreationTimestamp="2025-12-05 09:00:45 +0000 UTC" firstStartedPulling="2025-12-05 09:00:48.028711864 +0000 UTC m=+2199.601315613" lastFinishedPulling="2025-12-05 09:00:50.488246093 +0000 UTC m=+2202.060849832" observedRunningTime="2025-12-05 09:00:51.093834944 +0000 UTC m=+2202.666438683" watchObservedRunningTime="2025-12-05 09:00:51.108363668 +0000 UTC m=+2202.680967407" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.192636 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222"] Dec 05 09:00:51 crc kubenswrapper[4795]: E1205 09:00:51.193173 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939bdde4-5c5c-4d05-bf99-f5ae1fe7216a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.193193 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="939bdde4-5c5c-4d05-bf99-f5ae1fe7216a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.193411 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="939bdde4-5c5c-4d05-bf99-f5ae1fe7216a" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.194240 4795 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.203335 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.203574 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.203710 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.203854 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.224495 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222"] Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.285485 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.286005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.286063 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ng9f\" (UniqueName: \"kubernetes.io/projected/c21efe0b-8f08-49d0-9723-8497f78e7471-kube-api-access-8ng9f\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.387174 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.387519 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.387642 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ng9f\" (UniqueName: \"kubernetes.io/projected/c21efe0b-8f08-49d0-9723-8497f78e7471-kube-api-access-8ng9f\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.392526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: 
\"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.392526 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.406028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ng9f\" (UniqueName: \"kubernetes.io/projected/c21efe0b-8f08-49d0-9723-8497f78e7471-kube-api-access-8ng9f\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-k9222\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:51 crc kubenswrapper[4795]: I1205 09:00:51.535109 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:00:52 crc kubenswrapper[4795]: I1205 09:00:52.164128 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222"] Dec 05 09:00:52 crc kubenswrapper[4795]: I1205 09:00:52.813141 4795 scope.go:117] "RemoveContainer" containerID="ccc4fa14d91dac69258353003e0ef4116337a90fa2a6d3594c4e85764a686450" Dec 05 09:00:53 crc kubenswrapper[4795]: I1205 09:00:53.087832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" event={"ID":"c21efe0b-8f08-49d0-9723-8497f78e7471","Type":"ContainerStarted","Data":"a43c36564b02df8bdcd68812e6e44f0d32330eb091af3edb30ab4551c1227b07"} Dec 05 09:00:53 crc kubenswrapper[4795]: I1205 09:00:53.087890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" event={"ID":"c21efe0b-8f08-49d0-9723-8497f78e7471","Type":"ContainerStarted","Data":"05b6b6c349fd49bd2c3304fe9b4ca4098bec37a69eeac4ef778686751a107122"} Dec 05 09:00:53 crc kubenswrapper[4795]: I1205 09:00:53.113300 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" podStartSLOduration=1.956864912 podStartE2EDuration="2.113275399s" podCreationTimestamp="2025-12-05 09:00:51 +0000 UTC" firstStartedPulling="2025-12-05 09:00:52.187822944 +0000 UTC m=+2203.760426683" lastFinishedPulling="2025-12-05 09:00:52.344233431 +0000 UTC m=+2203.916837170" observedRunningTime="2025-12-05 09:00:53.107921094 +0000 UTC m=+2204.680524833" watchObservedRunningTime="2025-12-05 09:00:53.113275399 +0000 UTC m=+2204.685879138" Dec 05 09:00:56 crc kubenswrapper[4795]: I1205 09:00:56.292445 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:56 crc 
kubenswrapper[4795]: I1205 09:00:56.292864 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:56 crc kubenswrapper[4795]: I1205 09:00:56.343494 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:57 crc kubenswrapper[4795]: I1205 09:00:57.177786 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:00:57 crc kubenswrapper[4795]: I1205 09:00:57.235857 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwgqk"] Dec 05 09:00:59 crc kubenswrapper[4795]: I1205 09:00:59.142257 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwgqk" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerName="registry-server" containerID="cri-o://e8365d5c7c2e951d822c3c6f362f81a602d970a0b707dbd25aaaa36b84b241db" gracePeriod=2 Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.166049 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29415421-ppwmt"] Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.168311 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.168490 4795 generic.go:334] "Generic (PLEG): container finished" podID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerID="e8365d5c7c2e951d822c3c6f362f81a602d970a0b707dbd25aaaa36b84b241db" exitCode=0 Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.168540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgqk" event={"ID":"096e4f04-c680-4a72-ae03-5aef60d8e329","Type":"ContainerDied","Data":"e8365d5c7c2e951d822c3c6f362f81a602d970a0b707dbd25aaaa36b84b241db"} Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.168571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwgqk" event={"ID":"096e4f04-c680-4a72-ae03-5aef60d8e329","Type":"ContainerDied","Data":"8f9302eee4dbbe006932c047701f8c0d8ce235b645121287a03bfd4ebd502e89"} Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.168586 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f9302eee4dbbe006932c047701f8c0d8ce235b645121287a03bfd4ebd502e89" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.174703 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.196927 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415421-ppwmt"] Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.315189 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-utilities\") pod \"096e4f04-c680-4a72-ae03-5aef60d8e329\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.315821 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-catalog-content\") pod \"096e4f04-c680-4a72-ae03-5aef60d8e329\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.316046 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xn9q\" (UniqueName: \"kubernetes.io/projected/096e4f04-c680-4a72-ae03-5aef60d8e329-kube-api-access-9xn9q\") pod \"096e4f04-c680-4a72-ae03-5aef60d8e329\" (UID: \"096e4f04-c680-4a72-ae03-5aef60d8e329\") " Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.316337 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7pk8\" (UniqueName: \"kubernetes.io/projected/b29b36d5-2393-4db6-a124-a9e2adc28069-kube-api-access-q7pk8\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.316379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-config-data\") 
pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.316404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-fernet-keys\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.316503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-combined-ca-bundle\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.317651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-utilities" (OuterVolumeSpecName: "utilities") pod "096e4f04-c680-4a72-ae03-5aef60d8e329" (UID: "096e4f04-c680-4a72-ae03-5aef60d8e329"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.327034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/096e4f04-c680-4a72-ae03-5aef60d8e329-kube-api-access-9xn9q" (OuterVolumeSpecName: "kube-api-access-9xn9q") pod "096e4f04-c680-4a72-ae03-5aef60d8e329" (UID: "096e4f04-c680-4a72-ae03-5aef60d8e329"). InnerVolumeSpecName "kube-api-access-9xn9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.371845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "096e4f04-c680-4a72-ae03-5aef60d8e329" (UID: "096e4f04-c680-4a72-ae03-5aef60d8e329"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.418231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7pk8\" (UniqueName: \"kubernetes.io/projected/b29b36d5-2393-4db6-a124-a9e2adc28069-kube-api-access-q7pk8\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.418302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-config-data\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.418418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-fernet-keys\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.418586 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-combined-ca-bundle\") pod \"keystone-cron-29415421-ppwmt\" (UID: 
\"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.418727 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xn9q\" (UniqueName: \"kubernetes.io/projected/096e4f04-c680-4a72-ae03-5aef60d8e329-kube-api-access-9xn9q\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.418746 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.418759 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/096e4f04-c680-4a72-ae03-5aef60d8e329-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.426031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-fernet-keys\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.426574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-combined-ca-bundle\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.436380 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-config-data\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " 
pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.446604 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7pk8\" (UniqueName: \"kubernetes.io/projected/b29b36d5-2393-4db6-a124-a9e2adc28069-kube-api-access-q7pk8\") pod \"keystone-cron-29415421-ppwmt\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.492864 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:00 crc kubenswrapper[4795]: I1205 09:01:00.951533 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415421-ppwmt"] Dec 05 09:01:00 crc kubenswrapper[4795]: W1205 09:01:00.955152 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb29b36d5_2393_4db6_a124_a9e2adc28069.slice/crio-5d3eb2266fa9b44f5462ea2e359089a90ff11dbbdab471770dd3339403d8f807 WatchSource:0}: Error finding container 5d3eb2266fa9b44f5462ea2e359089a90ff11dbbdab471770dd3339403d8f807: Status 404 returned error can't find the container with id 5d3eb2266fa9b44f5462ea2e359089a90ff11dbbdab471770dd3339403d8f807 Dec 05 09:01:01 crc kubenswrapper[4795]: I1205 09:01:01.184645 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwgqk" Dec 05 09:01:01 crc kubenswrapper[4795]: I1205 09:01:01.184668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415421-ppwmt" event={"ID":"b29b36d5-2393-4db6-a124-a9e2adc28069","Type":"ContainerStarted","Data":"757062be973f6a222a6598cd02b4e1dee346acb96ab742cd753fee0a97da10fa"} Dec 05 09:01:01 crc kubenswrapper[4795]: I1205 09:01:01.184727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415421-ppwmt" event={"ID":"b29b36d5-2393-4db6-a124-a9e2adc28069","Type":"ContainerStarted","Data":"5d3eb2266fa9b44f5462ea2e359089a90ff11dbbdab471770dd3339403d8f807"} Dec 05 09:01:01 crc kubenswrapper[4795]: I1205 09:01:01.209738 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29415421-ppwmt" podStartSLOduration=1.209700512 podStartE2EDuration="1.209700512s" podCreationTimestamp="2025-12-05 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 09:01:01.206290329 +0000 UTC m=+2212.778894068" watchObservedRunningTime="2025-12-05 09:01:01.209700512 +0000 UTC m=+2212.782304271" Dec 05 09:01:01 crc kubenswrapper[4795]: I1205 09:01:01.235542 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwgqk"] Dec 05 09:01:01 crc kubenswrapper[4795]: I1205 09:01:01.247670 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwgqk"] Dec 05 09:01:02 crc kubenswrapper[4795]: I1205 09:01:02.763288 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" path="/var/lib/kubelet/pods/096e4f04-c680-4a72-ae03-5aef60d8e329/volumes" Dec 05 09:01:03 crc kubenswrapper[4795]: I1205 09:01:03.210933 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="c21efe0b-8f08-49d0-9723-8497f78e7471" containerID="a43c36564b02df8bdcd68812e6e44f0d32330eb091af3edb30ab4551c1227b07" exitCode=0 Dec 05 09:01:03 crc kubenswrapper[4795]: I1205 09:01:03.210999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" event={"ID":"c21efe0b-8f08-49d0-9723-8497f78e7471","Type":"ContainerDied","Data":"a43c36564b02df8bdcd68812e6e44f0d32330eb091af3edb30ab4551c1227b07"} Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.222606 4795 generic.go:334] "Generic (PLEG): container finished" podID="b29b36d5-2393-4db6-a124-a9e2adc28069" containerID="757062be973f6a222a6598cd02b4e1dee346acb96ab742cd753fee0a97da10fa" exitCode=0 Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.222679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415421-ppwmt" event={"ID":"b29b36d5-2393-4db6-a124-a9e2adc28069","Type":"ContainerDied","Data":"757062be973f6a222a6598cd02b4e1dee346acb96ab742cd753fee0a97da10fa"} Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.709106 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.826556 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-ssh-key\") pod \"c21efe0b-8f08-49d0-9723-8497f78e7471\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.826846 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-inventory\") pod \"c21efe0b-8f08-49d0-9723-8497f78e7471\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.826984 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ng9f\" (UniqueName: \"kubernetes.io/projected/c21efe0b-8f08-49d0-9723-8497f78e7471-kube-api-access-8ng9f\") pod \"c21efe0b-8f08-49d0-9723-8497f78e7471\" (UID: \"c21efe0b-8f08-49d0-9723-8497f78e7471\") " Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.833784 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21efe0b-8f08-49d0-9723-8497f78e7471-kube-api-access-8ng9f" (OuterVolumeSpecName: "kube-api-access-8ng9f") pod "c21efe0b-8f08-49d0-9723-8497f78e7471" (UID: "c21efe0b-8f08-49d0-9723-8497f78e7471"). InnerVolumeSpecName "kube-api-access-8ng9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.865435 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c21efe0b-8f08-49d0-9723-8497f78e7471" (UID: "c21efe0b-8f08-49d0-9723-8497f78e7471"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.865872 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-inventory" (OuterVolumeSpecName: "inventory") pod "c21efe0b-8f08-49d0-9723-8497f78e7471" (UID: "c21efe0b-8f08-49d0-9723-8497f78e7471"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.930365 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ng9f\" (UniqueName: \"kubernetes.io/projected/c21efe0b-8f08-49d0-9723-8497f78e7471-kube-api-access-8ng9f\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.930449 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:04 crc kubenswrapper[4795]: I1205 09:01:04.930460 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c21efe0b-8f08-49d0-9723-8497f78e7471-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.237940 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.237976 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-k9222" event={"ID":"c21efe0b-8f08-49d0-9723-8497f78e7471","Type":"ContainerDied","Data":"05b6b6c349fd49bd2c3304fe9b4ca4098bec37a69eeac4ef778686751a107122"} Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.240182 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05b6b6c349fd49bd2c3304fe9b4ca4098bec37a69eeac4ef778686751a107122" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.358944 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589"] Dec 05 09:01:05 crc kubenswrapper[4795]: E1205 09:01:05.359448 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21efe0b-8f08-49d0-9723-8497f78e7471" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.359464 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21efe0b-8f08-49d0-9723-8497f78e7471" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:01:05 crc kubenswrapper[4795]: E1205 09:01:05.359490 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerName="extract-utilities" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.359497 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerName="extract-utilities" Dec 05 09:01:05 crc kubenswrapper[4795]: E1205 09:01:05.359506 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerName="registry-server" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.359513 4795 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerName="registry-server" Dec 05 09:01:05 crc kubenswrapper[4795]: E1205 09:01:05.359546 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerName="extract-content" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.359554 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerName="extract-content" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.361946 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21efe0b-8f08-49d0-9723-8497f78e7471" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.361980 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="096e4f04-c680-4a72-ae03-5aef60d8e329" containerName="registry-server" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.362718 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.374686 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.391888 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.393242 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.402980 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.403211 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.416437 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.418079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.426564 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.448670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.448746 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.448798 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.448857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.448897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stb9d\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-kube-api-access-stb9d\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.448931 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.448959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.448997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.449020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.449052 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.449103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.449159 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.449189 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: 
\"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.449238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.463965 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589"] Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.551689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.551755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.551794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.551823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.551844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stb9d\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-kube-api-access-stb9d\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.551889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.551914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.552032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.552057 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.552084 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.552129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 
crc kubenswrapper[4795]: I1205 09:01:05.552178 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.552205 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.552240 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.566841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 
09:01:05.569917 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.570331 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.571598 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.571646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.573823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.573959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.577816 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.579420 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.579747 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.580226 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.580642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.585351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.589180 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stb9d\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-kube-api-access-stb9d\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2f589\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.596023 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.653479 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-fernet-keys\") pod \"b29b36d5-2393-4db6-a124-a9e2adc28069\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.653561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7pk8\" (UniqueName: \"kubernetes.io/projected/b29b36d5-2393-4db6-a124-a9e2adc28069-kube-api-access-q7pk8\") pod \"b29b36d5-2393-4db6-a124-a9e2adc28069\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.653672 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-config-data\") pod \"b29b36d5-2393-4db6-a124-a9e2adc28069\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.654017 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-combined-ca-bundle\") pod \"b29b36d5-2393-4db6-a124-a9e2adc28069\" (UID: \"b29b36d5-2393-4db6-a124-a9e2adc28069\") " Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.663107 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b29b36d5-2393-4db6-a124-a9e2adc28069" (UID: 
"b29b36d5-2393-4db6-a124-a9e2adc28069"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.665515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b29b36d5-2393-4db6-a124-a9e2adc28069-kube-api-access-q7pk8" (OuterVolumeSpecName: "kube-api-access-q7pk8") pod "b29b36d5-2393-4db6-a124-a9e2adc28069" (UID: "b29b36d5-2393-4db6-a124-a9e2adc28069"). InnerVolumeSpecName "kube-api-access-q7pk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.683578 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b29b36d5-2393-4db6-a124-a9e2adc28069" (UID: "b29b36d5-2393-4db6-a124-a9e2adc28069"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.714856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-config-data" (OuterVolumeSpecName: "config-data") pod "b29b36d5-2393-4db6-a124-a9e2adc28069" (UID: "b29b36d5-2393-4db6-a124-a9e2adc28069"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.721512 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.757142 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.757177 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.757187 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7pk8\" (UniqueName: \"kubernetes.io/projected/b29b36d5-2393-4db6-a124-a9e2adc28069-kube-api-access-q7pk8\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:05 crc kubenswrapper[4795]: I1205 09:01:05.757199 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b29b36d5-2393-4db6-a124-a9e2adc28069-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:06 crc kubenswrapper[4795]: I1205 09:01:06.252734 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415421-ppwmt" event={"ID":"b29b36d5-2393-4db6-a124-a9e2adc28069","Type":"ContainerDied","Data":"5d3eb2266fa9b44f5462ea2e359089a90ff11dbbdab471770dd3339403d8f807"} Dec 05 09:01:06 crc kubenswrapper[4795]: I1205 09:01:06.253450 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3eb2266fa9b44f5462ea2e359089a90ff11dbbdab471770dd3339403d8f807" Dec 05 09:01:06 crc kubenswrapper[4795]: I1205 09:01:06.252778 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415421-ppwmt" Dec 05 09:01:06 crc kubenswrapper[4795]: I1205 09:01:06.338152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589"] Dec 05 09:01:07 crc kubenswrapper[4795]: I1205 09:01:07.263716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" event={"ID":"74a4b975-a0ad-4798-8f20-2afce09644f9","Type":"ContainerStarted","Data":"11309d826388ce9b703b2f7f6afac589145af206bd585e2dd794a925202efa6e"} Dec 05 09:01:08 crc kubenswrapper[4795]: I1205 09:01:08.277964 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" event={"ID":"74a4b975-a0ad-4798-8f20-2afce09644f9","Type":"ContainerStarted","Data":"6d5c4bafcd4ba3c42dca364f09b1485911456bd0fa143fc4b9f71c34c55b3eb9"} Dec 05 09:01:08 crc kubenswrapper[4795]: I1205 09:01:08.310071 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" podStartSLOduration=1.9152883109999999 podStartE2EDuration="3.310046647s" podCreationTimestamp="2025-12-05 09:01:05 +0000 UTC" firstStartedPulling="2025-12-05 09:01:06.343194377 +0000 UTC m=+2217.915798116" lastFinishedPulling="2025-12-05 09:01:07.737952713 +0000 UTC m=+2219.310556452" observedRunningTime="2025-12-05 09:01:08.303771267 +0000 UTC m=+2219.876375006" watchObservedRunningTime="2025-12-05 09:01:08.310046647 +0000 UTC m=+2219.882650386" Dec 05 09:01:47 crc kubenswrapper[4795]: I1205 09:01:47.696925 4795 generic.go:334] "Generic (PLEG): container finished" podID="74a4b975-a0ad-4798-8f20-2afce09644f9" containerID="6d5c4bafcd4ba3c42dca364f09b1485911456bd0fa143fc4b9f71c34c55b3eb9" exitCode=0 Dec 05 09:01:47 crc kubenswrapper[4795]: I1205 09:01:47.697548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" event={"ID":"74a4b975-a0ad-4798-8f20-2afce09644f9","Type":"ContainerDied","Data":"6d5c4bafcd4ba3c42dca364f09b1485911456bd0fa143fc4b9f71c34c55b3eb9"} Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.200822 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.397582 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-telemetry-combined-ca-bundle\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.397663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stb9d\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-kube-api-access-stb9d\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.397776 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.397836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ovn-combined-ca-bundle\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: 
I1205 09:01:49.397861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-nova-combined-ca-bundle\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.397911 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.397935 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-neutron-metadata-combined-ca-bundle\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.397962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-bootstrap-combined-ca-bundle\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.398015 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-repo-setup-combined-ca-bundle\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.398058 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-inventory\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.398105 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-libvirt-combined-ca-bundle\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.398206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.398293 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.398340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ssh-key\") pod \"74a4b975-a0ad-4798-8f20-2afce09644f9\" (UID: \"74a4b975-a0ad-4798-8f20-2afce09644f9\") " Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.408799 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.409041 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-kube-api-access-stb9d" (OuterVolumeSpecName: "kube-api-access-stb9d") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "kube-api-access-stb9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.409100 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.409720 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.409627 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.413910 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.414180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.414885 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.415416 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.416258 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.420220 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.425471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.456216 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.482441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-inventory" (OuterVolumeSpecName: "inventory") pod "74a4b975-a0ad-4798-8f20-2afce09644f9" (UID: "74a4b975-a0ad-4798-8f20-2afce09644f9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500520 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500565 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500577 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500588 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500599 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500636 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stb9d\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-kube-api-access-stb9d\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500647 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500656 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500665 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500674 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/74a4b975-a0ad-4798-8f20-2afce09644f9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500684 4795 reconciler_common.go:293] 
"Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500694 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500707 4795 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.500717 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74a4b975-a0ad-4798-8f20-2afce09644f9-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.718079 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" event={"ID":"74a4b975-a0ad-4798-8f20-2afce09644f9","Type":"ContainerDied","Data":"11309d826388ce9b703b2f7f6afac589145af206bd585e2dd794a925202efa6e"} Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.718395 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11309d826388ce9b703b2f7f6afac589145af206bd585e2dd794a925202efa6e" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.718158 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2f589" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.865586 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b"] Dec 05 09:01:49 crc kubenswrapper[4795]: E1205 09:01:49.867545 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74a4b975-a0ad-4798-8f20-2afce09644f9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.867570 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74a4b975-a0ad-4798-8f20-2afce09644f9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 09:01:49 crc kubenswrapper[4795]: E1205 09:01:49.867600 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b29b36d5-2393-4db6-a124-a9e2adc28069" containerName="keystone-cron" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.867623 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b29b36d5-2393-4db6-a124-a9e2adc28069" containerName="keystone-cron" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.867829 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74a4b975-a0ad-4798-8f20-2afce09644f9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.867847 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b29b36d5-2393-4db6-a124-a9e2adc28069" containerName="keystone-cron" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.869305 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.871404 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.871841 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.872135 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.872393 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.873419 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:01:49 crc kubenswrapper[4795]: I1205 09:01:49.887679 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b"] Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.020438 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9mm\" (UniqueName: \"kubernetes.io/projected/3699268e-1a7d-4a95-9a21-538ddfff9e54-kube-api-access-tw9mm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.020878 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: 
\"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.021802 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.022054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.022210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.124043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.125044 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.125223 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.125366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9mm\" (UniqueName: \"kubernetes.io/projected/3699268e-1a7d-4a95-9a21-538ddfff9e54-kube-api-access-tw9mm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.125467 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.126362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc 
kubenswrapper[4795]: I1205 09:01:50.130026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.130259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.131596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.148725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9mm\" (UniqueName: \"kubernetes.io/projected/3699268e-1a7d-4a95-9a21-538ddfff9e54-kube-api-access-tw9mm\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-md59b\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.227250 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:01:50 crc kubenswrapper[4795]: I1205 09:01:50.839934 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b"] Dec 05 09:01:51 crc kubenswrapper[4795]: I1205 09:01:51.740130 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" event={"ID":"3699268e-1a7d-4a95-9a21-538ddfff9e54","Type":"ContainerStarted","Data":"3b54bfdd7a33fc07bd97d924948a371302593abaea5de8fe2d7bb00ec340a344"} Dec 05 09:01:51 crc kubenswrapper[4795]: I1205 09:01:51.740536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" event={"ID":"3699268e-1a7d-4a95-9a21-538ddfff9e54","Type":"ContainerStarted","Data":"b7e6bfc0963b4c64e22ddfa4ef78392e517459dacea18fc6309be626fd9232ef"} Dec 05 09:01:51 crc kubenswrapper[4795]: I1205 09:01:51.766297 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" podStartSLOduration=2.5227409290000002 podStartE2EDuration="2.766267175s" podCreationTimestamp="2025-12-05 09:01:49 +0000 UTC" firstStartedPulling="2025-12-05 09:01:50.850155423 +0000 UTC m=+2262.422759162" lastFinishedPulling="2025-12-05 09:01:51.093681679 +0000 UTC m=+2262.666285408" observedRunningTime="2025-12-05 09:01:51.75795736 +0000 UTC m=+2263.330561099" watchObservedRunningTime="2025-12-05 09:01:51.766267175 +0000 UTC m=+2263.338870914" Dec 05 09:03:09 crc kubenswrapper[4795]: I1205 09:03:09.528417 4795 generic.go:334] "Generic (PLEG): container finished" podID="3699268e-1a7d-4a95-9a21-538ddfff9e54" containerID="3b54bfdd7a33fc07bd97d924948a371302593abaea5de8fe2d7bb00ec340a344" exitCode=0 Dec 05 09:03:09 crc kubenswrapper[4795]: I1205 09:03:09.528510 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" event={"ID":"3699268e-1a7d-4a95-9a21-538ddfff9e54","Type":"ContainerDied","Data":"3b54bfdd7a33fc07bd97d924948a371302593abaea5de8fe2d7bb00ec340a344"} Dec 05 09:03:10 crc kubenswrapper[4795]: I1205 09:03:10.827389 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:03:10 crc kubenswrapper[4795]: I1205 09:03:10.828746 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.014157 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.182445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw9mm\" (UniqueName: \"kubernetes.io/projected/3699268e-1a7d-4a95-9a21-538ddfff9e54-kube-api-access-tw9mm\") pod \"3699268e-1a7d-4a95-9a21-538ddfff9e54\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.182772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ssh-key\") pod \"3699268e-1a7d-4a95-9a21-538ddfff9e54\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.182843 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovncontroller-config-0\") pod \"3699268e-1a7d-4a95-9a21-538ddfff9e54\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.182967 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-inventory\") pod \"3699268e-1a7d-4a95-9a21-538ddfff9e54\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.183028 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovn-combined-ca-bundle\") pod \"3699268e-1a7d-4a95-9a21-538ddfff9e54\" (UID: \"3699268e-1a7d-4a95-9a21-538ddfff9e54\") " Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.189412 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3699268e-1a7d-4a95-9a21-538ddfff9e54" (UID: "3699268e-1a7d-4a95-9a21-538ddfff9e54"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.202997 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3699268e-1a7d-4a95-9a21-538ddfff9e54-kube-api-access-tw9mm" (OuterVolumeSpecName: "kube-api-access-tw9mm") pod "3699268e-1a7d-4a95-9a21-538ddfff9e54" (UID: "3699268e-1a7d-4a95-9a21-538ddfff9e54"). InnerVolumeSpecName "kube-api-access-tw9mm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.209501 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3699268e-1a7d-4a95-9a21-538ddfff9e54" (UID: "3699268e-1a7d-4a95-9a21-538ddfff9e54"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.220582 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3699268e-1a7d-4a95-9a21-538ddfff9e54" (UID: "3699268e-1a7d-4a95-9a21-538ddfff9e54"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.223778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-inventory" (OuterVolumeSpecName: "inventory") pod "3699268e-1a7d-4a95-9a21-538ddfff9e54" (UID: "3699268e-1a7d-4a95-9a21-538ddfff9e54"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.286662 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.286698 4795 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.286710 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.286719 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3699268e-1a7d-4a95-9a21-538ddfff9e54-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.286729 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw9mm\" (UniqueName: \"kubernetes.io/projected/3699268e-1a7d-4a95-9a21-538ddfff9e54-kube-api-access-tw9mm\") on node \"crc\" DevicePath \"\"" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.548729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" event={"ID":"3699268e-1a7d-4a95-9a21-538ddfff9e54","Type":"ContainerDied","Data":"b7e6bfc0963b4c64e22ddfa4ef78392e517459dacea18fc6309be626fd9232ef"} Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.548784 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7e6bfc0963b4c64e22ddfa4ef78392e517459dacea18fc6309be626fd9232ef" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.548815 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-md59b" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.665130 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt"] Dec 05 09:03:11 crc kubenswrapper[4795]: E1205 09:03:11.665628 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3699268e-1a7d-4a95-9a21-538ddfff9e54" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.665660 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3699268e-1a7d-4a95-9a21-538ddfff9e54" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.665883 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3699268e-1a7d-4a95-9a21-538ddfff9e54" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.666690 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.673834 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.674061 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.674142 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.674189 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.674202 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.679061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt"] Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.682927 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.797890 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.798448 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.798494 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.798669 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.798703 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqc4\" (UniqueName: \"kubernetes.io/projected/7ae18232-77c3-44cb-909e-fda5169b4d1c-kube-api-access-cxqc4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.798779 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.900760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.900860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.900899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.901023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.901050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqc4\" (UniqueName: \"kubernetes.io/projected/7ae18232-77c3-44cb-909e-fda5169b4d1c-kube-api-access-cxqc4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.901086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.909172 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.910985 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.912473 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.915159 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.921092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc kubenswrapper[4795]: I1205 09:03:11.921904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqc4\" (UniqueName: \"kubernetes.io/projected/7ae18232-77c3-44cb-909e-fda5169b4d1c-kube-api-access-cxqc4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:11 crc 
kubenswrapper[4795]: I1205 09:03:11.987413 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:03:12 crc kubenswrapper[4795]: I1205 09:03:12.606039 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:03:12 crc kubenswrapper[4795]: I1205 09:03:12.608812 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt"] Dec 05 09:03:13 crc kubenswrapper[4795]: I1205 09:03:13.576214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" event={"ID":"7ae18232-77c3-44cb-909e-fda5169b4d1c","Type":"ContainerStarted","Data":"719ffeb2d8bb71229ff06c7ba39d9e57917bb677a4c48ab57e58bf508b071ac7"} Dec 05 09:03:13 crc kubenswrapper[4795]: I1205 09:03:13.578160 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" event={"ID":"7ae18232-77c3-44cb-909e-fda5169b4d1c","Type":"ContainerStarted","Data":"8eb5740b0df44d703adef89a126fe68e0010d45be49e36d2cf4de004e9cb2ed0"} Dec 05 09:03:13 crc kubenswrapper[4795]: I1205 09:03:13.606265 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" podStartSLOduration=2.407882419 podStartE2EDuration="2.606239151s" podCreationTimestamp="2025-12-05 09:03:11 +0000 UTC" firstStartedPulling="2025-12-05 09:03:12.605751134 +0000 UTC m=+2344.178354873" lastFinishedPulling="2025-12-05 09:03:12.804107866 +0000 UTC m=+2344.376711605" observedRunningTime="2025-12-05 09:03:13.602012867 +0000 UTC m=+2345.174616606" watchObservedRunningTime="2025-12-05 09:03:13.606239151 +0000 UTC m=+2345.178842890" Dec 05 09:03:40 crc kubenswrapper[4795]: I1205 09:03:40.826978 4795 patch_prober.go:28] interesting 
pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:03:40 crc kubenswrapper[4795]: I1205 09:03:40.827849 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:04:10 crc kubenswrapper[4795]: I1205 09:04:10.318226 4795 generic.go:334] "Generic (PLEG): container finished" podID="7ae18232-77c3-44cb-909e-fda5169b4d1c" containerID="719ffeb2d8bb71229ff06c7ba39d9e57917bb677a4c48ab57e58bf508b071ac7" exitCode=0 Dec 05 09:04:10 crc kubenswrapper[4795]: I1205 09:04:10.318358 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" event={"ID":"7ae18232-77c3-44cb-909e-fda5169b4d1c","Type":"ContainerDied","Data":"719ffeb2d8bb71229ff06c7ba39d9e57917bb677a4c48ab57e58bf508b071ac7"} Dec 05 09:04:10 crc kubenswrapper[4795]: I1205 09:04:10.827590 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:04:10 crc kubenswrapper[4795]: I1205 09:04:10.828087 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 
09:04:10 crc kubenswrapper[4795]: I1205 09:04:10.828148 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:04:10 crc kubenswrapper[4795]: I1205 09:04:10.829103 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:04:10 crc kubenswrapper[4795]: I1205 09:04:10.829193 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" gracePeriod=600 Dec 05 09:04:10 crc kubenswrapper[4795]: E1205 09:04:10.963347 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:04:11 crc kubenswrapper[4795]: I1205 09:04:11.331023 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" exitCode=0 Dec 05 09:04:11 crc kubenswrapper[4795]: I1205 09:04:11.331242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5"} Dec 05 09:04:11 crc kubenswrapper[4795]: I1205 09:04:11.331285 4795 scope.go:117] "RemoveContainer" containerID="3bb8db58c6ff5eff08a8e0fa9aff2e776f8b94d253c5d66fa55553b15803d255" Dec 05 09:04:11 crc kubenswrapper[4795]: I1205 09:04:11.332026 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:04:11 crc kubenswrapper[4795]: E1205 09:04:11.332425 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:04:11 crc kubenswrapper[4795]: I1205 09:04:11.914092 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.084387 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-metadata-combined-ca-bundle\") pod \"7ae18232-77c3-44cb-909e-fda5169b4d1c\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.084680 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"7ae18232-77c3-44cb-909e-fda5169b4d1c\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.084747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-nova-metadata-neutron-config-0\") pod \"7ae18232-77c3-44cb-909e-fda5169b4d1c\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.084832 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxqc4\" (UniqueName: \"kubernetes.io/projected/7ae18232-77c3-44cb-909e-fda5169b4d1c-kube-api-access-cxqc4\") pod \"7ae18232-77c3-44cb-909e-fda5169b4d1c\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.085660 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-ssh-key\") pod \"7ae18232-77c3-44cb-909e-fda5169b4d1c\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " Dec 05 
09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.085694 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-inventory\") pod \"7ae18232-77c3-44cb-909e-fda5169b4d1c\" (UID: \"7ae18232-77c3-44cb-909e-fda5169b4d1c\") " Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.092699 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7ae18232-77c3-44cb-909e-fda5169b4d1c" (UID: "7ae18232-77c3-44cb-909e-fda5169b4d1c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.092942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae18232-77c3-44cb-909e-fda5169b4d1c-kube-api-access-cxqc4" (OuterVolumeSpecName: "kube-api-access-cxqc4") pod "7ae18232-77c3-44cb-909e-fda5169b4d1c" (UID: "7ae18232-77c3-44cb-909e-fda5169b4d1c"). InnerVolumeSpecName "kube-api-access-cxqc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.119080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "7ae18232-77c3-44cb-909e-fda5169b4d1c" (UID: "7ae18232-77c3-44cb-909e-fda5169b4d1c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.121783 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ae18232-77c3-44cb-909e-fda5169b4d1c" (UID: "7ae18232-77c3-44cb-909e-fda5169b4d1c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.122919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "7ae18232-77c3-44cb-909e-fda5169b4d1c" (UID: "7ae18232-77c3-44cb-909e-fda5169b4d1c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.123252 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-inventory" (OuterVolumeSpecName: "inventory") pod "7ae18232-77c3-44cb-909e-fda5169b4d1c" (UID: "7ae18232-77c3-44cb-909e-fda5169b4d1c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.188527 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.188564 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.188576 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.188586 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxqc4\" (UniqueName: \"kubernetes.io/projected/7ae18232-77c3-44cb-909e-fda5169b4d1c-kube-api-access-cxqc4\") on node \"crc\" DevicePath \"\"" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.188597 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.188605 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae18232-77c3-44cb-909e-fda5169b4d1c-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.344005 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" 
event={"ID":"7ae18232-77c3-44cb-909e-fda5169b4d1c","Type":"ContainerDied","Data":"8eb5740b0df44d703adef89a126fe68e0010d45be49e36d2cf4de004e9cb2ed0"} Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.344047 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.344060 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8eb5740b0df44d703adef89a126fe68e0010d45be49e36d2cf4de004e9cb2ed0" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.476937 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9"] Dec 05 09:04:12 crc kubenswrapper[4795]: E1205 09:04:12.477455 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae18232-77c3-44cb-909e-fda5169b4d1c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.477478 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae18232-77c3-44cb-909e-fda5169b4d1c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.480054 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae18232-77c3-44cb-909e-fda5169b4d1c" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.481049 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.486523 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.486834 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.487113 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.487319 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.487475 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.498553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9"] Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.598222 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjbpz\" (UniqueName: \"kubernetes.io/projected/cfe4932a-495e-46cb-981d-71465ed7e1ff-kube-api-access-xjbpz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.598307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: 
\"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.598340 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.598415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.598455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.699901 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.700246 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.700374 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.700465 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.700628 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjbpz\" (UniqueName: \"kubernetes.io/projected/cfe4932a-495e-46cb-981d-71465ed7e1ff-kube-api-access-xjbpz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.706294 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.706860 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.708714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.711550 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.722672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjbpz\" (UniqueName: \"kubernetes.io/projected/cfe4932a-495e-46cb-981d-71465ed7e1ff-kube-api-access-xjbpz\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:12 crc kubenswrapper[4795]: I1205 09:04:12.812057 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:04:13 crc kubenswrapper[4795]: I1205 09:04:13.408553 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9"] Dec 05 09:04:14 crc kubenswrapper[4795]: I1205 09:04:14.370267 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" event={"ID":"cfe4932a-495e-46cb-981d-71465ed7e1ff","Type":"ContainerStarted","Data":"a58d983b698c0a43cd04c2a1e13ba2c6653aaf6b92d8e95992a38b83acf8f958"} Dec 05 09:04:14 crc kubenswrapper[4795]: I1205 09:04:14.371067 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" event={"ID":"cfe4932a-495e-46cb-981d-71465ed7e1ff","Type":"ContainerStarted","Data":"0404bc1d2960fbee5bcfecedfead20e5108f2e372e9bfc0b915ecced5000801e"} Dec 05 09:04:14 crc kubenswrapper[4795]: I1205 09:04:14.400001 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" podStartSLOduration=2.214683227 podStartE2EDuration="2.399972665s" podCreationTimestamp="2025-12-05 09:04:12 +0000 UTC" firstStartedPulling="2025-12-05 09:04:13.423520619 +0000 UTC m=+2404.996124358" lastFinishedPulling="2025-12-05 09:04:13.608810057 +0000 UTC m=+2405.181413796" observedRunningTime="2025-12-05 09:04:14.395411481 +0000 UTC m=+2405.968015240" watchObservedRunningTime="2025-12-05 09:04:14.399972665 +0000 UTC m=+2405.972576414" Dec 05 09:04:21 crc kubenswrapper[4795]: I1205 09:04:21.748058 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:04:21 crc kubenswrapper[4795]: E1205 09:04:21.749053 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:04:33 crc kubenswrapper[4795]: I1205 09:04:33.748285 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:04:33 crc kubenswrapper[4795]: E1205 09:04:33.749395 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:04:45 crc kubenswrapper[4795]: I1205 09:04:45.747946 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:04:45 crc kubenswrapper[4795]: E1205 09:04:45.748962 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:04:57 crc kubenswrapper[4795]: I1205 09:04:57.747547 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:04:57 crc kubenswrapper[4795]: E1205 09:04:57.748464 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:05:10 crc kubenswrapper[4795]: I1205 09:05:10.747492 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:05:10 crc kubenswrapper[4795]: E1205 09:05:10.748744 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:05:23 crc kubenswrapper[4795]: I1205 09:05:23.748340 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:05:23 crc kubenswrapper[4795]: E1205 09:05:23.749551 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:05:35 crc kubenswrapper[4795]: I1205 09:05:35.748558 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:05:35 crc kubenswrapper[4795]: E1205 09:05:35.749899 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:05:49 crc kubenswrapper[4795]: I1205 09:05:49.747951 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:05:49 crc kubenswrapper[4795]: E1205 09:05:49.748933 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:06:02 crc kubenswrapper[4795]: I1205 09:06:02.747398 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:06:02 crc kubenswrapper[4795]: E1205 09:06:02.748639 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:06:15 crc kubenswrapper[4795]: I1205 09:06:15.747835 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:06:15 crc kubenswrapper[4795]: E1205 09:06:15.748717 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:06:27 crc kubenswrapper[4795]: I1205 09:06:27.747170 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:06:27 crc kubenswrapper[4795]: E1205 09:06:27.748386 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:06:40 crc kubenswrapper[4795]: I1205 09:06:40.749311 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:06:40 crc kubenswrapper[4795]: E1205 09:06:40.750393 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:06:52 crc kubenswrapper[4795]: I1205 09:06:52.747945 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:06:52 crc kubenswrapper[4795]: E1205 09:06:52.749098 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:06:52 crc kubenswrapper[4795]: I1205 09:06:52.983128 4795 scope.go:117] "RemoveContainer" containerID="e8365d5c7c2e951d822c3c6f362f81a602d970a0b707dbd25aaaa36b84b241db" Dec 05 09:06:53 crc kubenswrapper[4795]: I1205 09:06:53.007925 4795 scope.go:117] "RemoveContainer" containerID="f01f6023497a99bc17ae1d9b7a7ca4ca57c82f33ca06c676da61f144449cf380" Dec 05 09:06:53 crc kubenswrapper[4795]: I1205 09:06:53.052969 4795 scope.go:117] "RemoveContainer" containerID="314075f41e944df5fe962d78d9b31f9f0f35dbba8e520612395965798da76115" Dec 05 09:07:07 crc kubenswrapper[4795]: I1205 09:07:07.770047 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:07:07 crc kubenswrapper[4795]: E1205 09:07:07.771124 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:07:19 crc kubenswrapper[4795]: I1205 09:07:19.747867 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:07:19 crc kubenswrapper[4795]: E1205 09:07:19.748854 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:07:33 crc kubenswrapper[4795]: I1205 09:07:33.747723 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:07:33 crc kubenswrapper[4795]: E1205 09:07:33.748796 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:07:45 crc kubenswrapper[4795]: I1205 09:07:45.747871 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:07:45 crc kubenswrapper[4795]: E1205 09:07:45.749002 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:07:57 crc kubenswrapper[4795]: I1205 09:07:57.747176 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:07:57 crc kubenswrapper[4795]: E1205 09:07:57.748183 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:08:08 crc kubenswrapper[4795]: I1205 09:08:08.747852 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:08:08 crc kubenswrapper[4795]: E1205 09:08:08.748871 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:08:22 crc kubenswrapper[4795]: I1205 09:08:22.747947 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:08:22 crc kubenswrapper[4795]: E1205 09:08:22.748775 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:08:33 crc kubenswrapper[4795]: I1205 09:08:33.748152 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:08:33 crc kubenswrapper[4795]: E1205 09:08:33.749256 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:08:44 crc kubenswrapper[4795]: I1205 09:08:44.251951 4795 generic.go:334] "Generic (PLEG): container finished" podID="cfe4932a-495e-46cb-981d-71465ed7e1ff" containerID="a58d983b698c0a43cd04c2a1e13ba2c6653aaf6b92d8e95992a38b83acf8f958" exitCode=0 Dec 05 09:08:44 crc kubenswrapper[4795]: I1205 09:08:44.252026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" event={"ID":"cfe4932a-495e-46cb-981d-71465ed7e1ff","Type":"ContainerDied","Data":"a58d983b698c0a43cd04c2a1e13ba2c6653aaf6b92d8e95992a38b83acf8f958"} Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.750890 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.942765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-secret-0\") pod \"cfe4932a-495e-46cb-981d-71465ed7e1ff\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.942896 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjbpz\" (UniqueName: \"kubernetes.io/projected/cfe4932a-495e-46cb-981d-71465ed7e1ff-kube-api-access-xjbpz\") pod \"cfe4932a-495e-46cb-981d-71465ed7e1ff\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.942958 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-inventory\") pod \"cfe4932a-495e-46cb-981d-71465ed7e1ff\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.942986 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-ssh-key\") pod \"cfe4932a-495e-46cb-981d-71465ed7e1ff\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.943029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-combined-ca-bundle\") pod \"cfe4932a-495e-46cb-981d-71465ed7e1ff\" (UID: \"cfe4932a-495e-46cb-981d-71465ed7e1ff\") " Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.950825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "cfe4932a-495e-46cb-981d-71465ed7e1ff" (UID: "cfe4932a-495e-46cb-981d-71465ed7e1ff"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.951938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe4932a-495e-46cb-981d-71465ed7e1ff-kube-api-access-xjbpz" (OuterVolumeSpecName: "kube-api-access-xjbpz") pod "cfe4932a-495e-46cb-981d-71465ed7e1ff" (UID: "cfe4932a-495e-46cb-981d-71465ed7e1ff"). InnerVolumeSpecName "kube-api-access-xjbpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.975877 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "cfe4932a-495e-46cb-981d-71465ed7e1ff" (UID: "cfe4932a-495e-46cb-981d-71465ed7e1ff"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.978898 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-inventory" (OuterVolumeSpecName: "inventory") pod "cfe4932a-495e-46cb-981d-71465ed7e1ff" (UID: "cfe4932a-495e-46cb-981d-71465ed7e1ff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:08:45 crc kubenswrapper[4795]: I1205 09:08:45.986057 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cfe4932a-495e-46cb-981d-71465ed7e1ff" (UID: "cfe4932a-495e-46cb-981d-71465ed7e1ff"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.045221 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.045375 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjbpz\" (UniqueName: \"kubernetes.io/projected/cfe4932a-495e-46cb-981d-71465ed7e1ff-kube-api-access-xjbpz\") on node \"crc\" DevicePath \"\"" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.045444 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.045499 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.045559 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfe4932a-495e-46cb-981d-71465ed7e1ff-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.272800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" event={"ID":"cfe4932a-495e-46cb-981d-71465ed7e1ff","Type":"ContainerDied","Data":"0404bc1d2960fbee5bcfecedfead20e5108f2e372e9bfc0b915ecced5000801e"} Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.273180 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0404bc1d2960fbee5bcfecedfead20e5108f2e372e9bfc0b915ecced5000801e" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 
09:08:46.272928 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.450276 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq"] Dec 05 09:08:46 crc kubenswrapper[4795]: E1205 09:08:46.450777 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe4932a-495e-46cb-981d-71465ed7e1ff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.450797 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe4932a-495e-46cb-981d-71465ed7e1ff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.451052 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe4932a-495e-46cb-981d-71465ed7e1ff" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.451825 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.454400 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.455359 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.457033 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.457205 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.457424 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.460929 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.461218 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.468334 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq"] Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.563940 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 
09:08:46.564048 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9zg\" (UniqueName: \"kubernetes.io/projected/6fca34cb-0c72-422c-86e9-638584bb9dcb-kube-api-access-vf9zg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.564075 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.564093 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.564131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.564152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" 
(UniqueName: \"kubernetes.io/configmap/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.564371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.564446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.564532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666374 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9zg\" (UniqueName: \"kubernetes.io/projected/6fca34cb-0c72-422c-86e9-638584bb9dcb-kube-api-access-vf9zg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666450 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666601 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.666702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.668010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc 
kubenswrapper[4795]: I1205 09:08:46.671152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.671389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.672080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.672961 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.673535 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: 
\"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.681901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.681996 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.685675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9zg\" (UniqueName: \"kubernetes.io/projected/6fca34cb-0c72-422c-86e9-638584bb9dcb-kube-api-access-vf9zg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-m5khq\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.747601 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:08:46 crc kubenswrapper[4795]: E1205 09:08:46.748016 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:08:46 crc kubenswrapper[4795]: I1205 09:08:46.815937 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:08:47 crc kubenswrapper[4795]: I1205 09:08:47.443755 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq"] Dec 05 09:08:47 crc kubenswrapper[4795]: I1205 09:08:47.452916 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:08:48 crc kubenswrapper[4795]: I1205 09:08:48.296097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" event={"ID":"6fca34cb-0c72-422c-86e9-638584bb9dcb","Type":"ContainerStarted","Data":"8facf4e25635888dabff1790f496fc07a408f337e1e4e844a8d57da1e9920888"} Dec 05 09:08:48 crc kubenswrapper[4795]: I1205 09:08:48.296576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" event={"ID":"6fca34cb-0c72-422c-86e9-638584bb9dcb","Type":"ContainerStarted","Data":"c922b6a190b1a8dab734501d7effaa307463e5963982d8620d86478109aa1666"} Dec 05 09:08:48 crc kubenswrapper[4795]: I1205 09:08:48.327438 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" podStartSLOduration=2.142840392 podStartE2EDuration="2.327411205s" podCreationTimestamp="2025-12-05 09:08:46 +0000 UTC" firstStartedPulling="2025-12-05 09:08:47.452660041 +0000 UTC m=+2679.025263780" lastFinishedPulling="2025-12-05 09:08:47.637230854 +0000 UTC m=+2679.209834593" observedRunningTime="2025-12-05 09:08:48.321040942 +0000 UTC m=+2679.893644701" watchObservedRunningTime="2025-12-05 09:08:48.327411205 +0000 UTC m=+2679.900014934" Dec 05 09:08:55 crc 
kubenswrapper[4795]: I1205 09:08:55.634957 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-htqhl"] Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.638258 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.660473 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-htqhl"] Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.685243 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-catalog-content\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.685711 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-utilities\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.685775 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqhrv\" (UniqueName: \"kubernetes.io/projected/9f8a1298-aa53-4a07-9549-af3203c61a0f-kube-api-access-cqhrv\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.788252 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-utilities\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.788492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqhrv\" (UniqueName: \"kubernetes.io/projected/9f8a1298-aa53-4a07-9549-af3203c61a0f-kube-api-access-cqhrv\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.788556 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-catalog-content\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.789440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-utilities\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.789519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-catalog-content\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.813003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqhrv\" (UniqueName: 
\"kubernetes.io/projected/9f8a1298-aa53-4a07-9549-af3203c61a0f-kube-api-access-cqhrv\") pod \"redhat-marketplace-htqhl\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:55 crc kubenswrapper[4795]: I1205 09:08:55.965747 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:08:56 crc kubenswrapper[4795]: I1205 09:08:56.582244 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-htqhl"] Dec 05 09:08:57 crc kubenswrapper[4795]: I1205 09:08:57.389839 4795 generic.go:334] "Generic (PLEG): container finished" podID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerID="bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545" exitCode=0 Dec 05 09:08:57 crc kubenswrapper[4795]: I1205 09:08:57.389945 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htqhl" event={"ID":"9f8a1298-aa53-4a07-9549-af3203c61a0f","Type":"ContainerDied","Data":"bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545"} Dec 05 09:08:57 crc kubenswrapper[4795]: I1205 09:08:57.390449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htqhl" event={"ID":"9f8a1298-aa53-4a07-9549-af3203c61a0f","Type":"ContainerStarted","Data":"d140e702cd33acf6695ae29ead1c5eafef37558bfe0ac47d242583221a63dc7c"} Dec 05 09:08:58 crc kubenswrapper[4795]: I1205 09:08:58.404129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htqhl" event={"ID":"9f8a1298-aa53-4a07-9549-af3203c61a0f","Type":"ContainerStarted","Data":"75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1"} Dec 05 09:08:59 crc kubenswrapper[4795]: I1205 09:08:59.418720 4795 generic.go:334] "Generic (PLEG): container finished" podID="9f8a1298-aa53-4a07-9549-af3203c61a0f" 
containerID="75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1" exitCode=0 Dec 05 09:08:59 crc kubenswrapper[4795]: I1205 09:08:59.418955 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htqhl" event={"ID":"9f8a1298-aa53-4a07-9549-af3203c61a0f","Type":"ContainerDied","Data":"75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1"} Dec 05 09:08:59 crc kubenswrapper[4795]: I1205 09:08:59.747601 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:08:59 crc kubenswrapper[4795]: E1205 09:08:59.747948 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:09:00 crc kubenswrapper[4795]: I1205 09:09:00.438928 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htqhl" event={"ID":"9f8a1298-aa53-4a07-9549-af3203c61a0f","Type":"ContainerStarted","Data":"8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400"} Dec 05 09:09:00 crc kubenswrapper[4795]: I1205 09:09:00.460478 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-htqhl" podStartSLOduration=3.011258111 podStartE2EDuration="5.460454506s" podCreationTimestamp="2025-12-05 09:08:55 +0000 UTC" firstStartedPulling="2025-12-05 09:08:57.391768212 +0000 UTC m=+2688.964371951" lastFinishedPulling="2025-12-05 09:08:59.840964597 +0000 UTC m=+2691.413568346" observedRunningTime="2025-12-05 09:09:00.457336531 +0000 UTC m=+2692.029940270" watchObservedRunningTime="2025-12-05 
09:09:00.460454506 +0000 UTC m=+2692.033058245" Dec 05 09:09:05 crc kubenswrapper[4795]: I1205 09:09:05.966765 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:09:05 crc kubenswrapper[4795]: I1205 09:09:05.967660 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:09:06 crc kubenswrapper[4795]: I1205 09:09:06.019038 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:09:06 crc kubenswrapper[4795]: I1205 09:09:06.545269 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:09:06 crc kubenswrapper[4795]: I1205 09:09:06.607384 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-htqhl"] Dec 05 09:09:08 crc kubenswrapper[4795]: I1205 09:09:08.513272 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-htqhl" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerName="registry-server" containerID="cri-o://8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400" gracePeriod=2 Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.019777 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.195164 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqhrv\" (UniqueName: \"kubernetes.io/projected/9f8a1298-aa53-4a07-9549-af3203c61a0f-kube-api-access-cqhrv\") pod \"9f8a1298-aa53-4a07-9549-af3203c61a0f\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.195284 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-utilities\") pod \"9f8a1298-aa53-4a07-9549-af3203c61a0f\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.195603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-catalog-content\") pod \"9f8a1298-aa53-4a07-9549-af3203c61a0f\" (UID: \"9f8a1298-aa53-4a07-9549-af3203c61a0f\") " Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.196709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-utilities" (OuterVolumeSpecName: "utilities") pod "9f8a1298-aa53-4a07-9549-af3203c61a0f" (UID: "9f8a1298-aa53-4a07-9549-af3203c61a0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.203495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8a1298-aa53-4a07-9549-af3203c61a0f-kube-api-access-cqhrv" (OuterVolumeSpecName: "kube-api-access-cqhrv") pod "9f8a1298-aa53-4a07-9549-af3203c61a0f" (UID: "9f8a1298-aa53-4a07-9549-af3203c61a0f"). InnerVolumeSpecName "kube-api-access-cqhrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.216108 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f8a1298-aa53-4a07-9549-af3203c61a0f" (UID: "9f8a1298-aa53-4a07-9549-af3203c61a0f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.297982 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.298297 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqhrv\" (UniqueName: \"kubernetes.io/projected/9f8a1298-aa53-4a07-9549-af3203c61a0f-kube-api-access-cqhrv\") on node \"crc\" DevicePath \"\"" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.298386 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f8a1298-aa53-4a07-9549-af3203c61a0f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.526533 4795 generic.go:334] "Generic (PLEG): container finished" podID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerID="8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400" exitCode=0 Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.526583 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htqhl" event={"ID":"9f8a1298-aa53-4a07-9549-af3203c61a0f","Type":"ContainerDied","Data":"8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400"} Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.526666 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htqhl" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.527330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htqhl" event={"ID":"9f8a1298-aa53-4a07-9549-af3203c61a0f","Type":"ContainerDied","Data":"d140e702cd33acf6695ae29ead1c5eafef37558bfe0ac47d242583221a63dc7c"} Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.527368 4795 scope.go:117] "RemoveContainer" containerID="8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.572681 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-htqhl"] Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.576730 4795 scope.go:117] "RemoveContainer" containerID="75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.581533 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-htqhl"] Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.609317 4795 scope.go:117] "RemoveContainer" containerID="bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.660467 4795 scope.go:117] "RemoveContainer" containerID="8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400" Dec 05 09:09:09 crc kubenswrapper[4795]: E1205 09:09:09.660992 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400\": container with ID starting with 8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400 not found: ID does not exist" containerID="8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.661038 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400"} err="failed to get container status \"8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400\": rpc error: code = NotFound desc = could not find container \"8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400\": container with ID starting with 8a928ca3b4f143f3cdd90e57bd6d32a7c2a7738438789b16ac64189ff12c2400 not found: ID does not exist" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.661073 4795 scope.go:117] "RemoveContainer" containerID="75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1" Dec 05 09:09:09 crc kubenswrapper[4795]: E1205 09:09:09.661534 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1\": container with ID starting with 75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1 not found: ID does not exist" containerID="75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.661586 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1"} err="failed to get container status \"75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1\": rpc error: code = NotFound desc = could not find container \"75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1\": container with ID starting with 75eff9e55bb1bdfba9f852f72a5391a46a1bbbe2b553e30fb8a3989f0b30f8f1 not found: ID does not exist" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.661684 4795 scope.go:117] "RemoveContainer" containerID="bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545" Dec 05 09:09:09 crc kubenswrapper[4795]: E1205 
09:09:09.662245 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545\": container with ID starting with bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545 not found: ID does not exist" containerID="bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545" Dec 05 09:09:09 crc kubenswrapper[4795]: I1205 09:09:09.662271 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545"} err="failed to get container status \"bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545\": rpc error: code = NotFound desc = could not find container \"bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545\": container with ID starting with bc7530d530522c9a30a0170461246d549c1031611807988282b18edbd72a7545 not found: ID does not exist" Dec 05 09:09:10 crc kubenswrapper[4795]: I1205 09:09:10.761499 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" path="/var/lib/kubelet/pods/9f8a1298-aa53-4a07-9549-af3203c61a0f/volumes" Dec 05 09:09:11 crc kubenswrapper[4795]: I1205 09:09:11.747635 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:09:12 crc kubenswrapper[4795]: I1205 09:09:12.561802 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"c28c21c58cbde5dcd526aaa1c5dc6288bc2bfd298ef9e3bbe818c729d69598f8"} Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.409178 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qhq5q"] Dec 05 09:09:13 crc kubenswrapper[4795]: E1205 
09:09:13.410203 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerName="registry-server"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.410226 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerName="registry-server"
Dec 05 09:09:13 crc kubenswrapper[4795]: E1205 09:09:13.410257 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerName="extract-content"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.410264 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerName="extract-content"
Dec 05 09:09:13 crc kubenswrapper[4795]: E1205 09:09:13.410291 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerName="extract-utilities"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.410298 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerName="extract-utilities"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.410532 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8a1298-aa53-4a07-9549-af3203c61a0f" containerName="registry-server"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.412545 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.426995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhq5q"]
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.496143 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vct29\" (UniqueName: \"kubernetes.io/projected/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-kube-api-access-vct29\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.496720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-utilities\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.496810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-catalog-content\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.599836 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-utilities\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.599125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-utilities\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.600053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-catalog-content\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.600256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vct29\" (UniqueName: \"kubernetes.io/projected/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-kube-api-access-vct29\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.600995 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-catalog-content\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.627365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vct29\" (UniqueName: \"kubernetes.io/projected/a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e-kube-api-access-vct29\") pod \"community-operators-qhq5q\" (UID: \"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e\") " pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:13 crc kubenswrapper[4795]: I1205 09:09:13.743544 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:14 crc kubenswrapper[4795]: I1205 09:09:14.426267 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhq5q"]
Dec 05 09:09:14 crc kubenswrapper[4795]: I1205 09:09:14.584984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhq5q" event={"ID":"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e","Type":"ContainerStarted","Data":"5ef87c00c0e1c5ec5766170d71cc7b27ce6d262970850a031fe038f1d0818370"}
Dec 05 09:09:15 crc kubenswrapper[4795]: I1205 09:09:15.605489 4795 generic.go:334] "Generic (PLEG): container finished" podID="a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e" containerID="997f1e9b46f4cefba862aa88476b0329c702663803e48139ace737c90de4841e" exitCode=0
Dec 05 09:09:15 crc kubenswrapper[4795]: I1205 09:09:15.605597 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhq5q" event={"ID":"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e","Type":"ContainerDied","Data":"997f1e9b46f4cefba862aa88476b0329c702663803e48139ace737c90de4841e"}
Dec 05 09:09:21 crc kubenswrapper[4795]: I1205 09:09:21.682320 4795 generic.go:334] "Generic (PLEG): container finished" podID="a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e" containerID="fcf014ae6e816e7438e176a730cb58ac4bbc3c13dd593306b4bcebfe4988ba11" exitCode=0
Dec 05 09:09:21 crc kubenswrapper[4795]: I1205 09:09:21.682457 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhq5q" event={"ID":"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e","Type":"ContainerDied","Data":"fcf014ae6e816e7438e176a730cb58ac4bbc3c13dd593306b4bcebfe4988ba11"}
Dec 05 09:09:24 crc kubenswrapper[4795]: I1205 09:09:24.713869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhq5q" event={"ID":"a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e","Type":"ContainerStarted","Data":"9a1351f529b5db006e956e721ce5920d762db9e4e693f8734879529705d35a5a"}
Dec 05 09:09:24 crc kubenswrapper[4795]: I1205 09:09:24.757890 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qhq5q" podStartSLOduration=3.403504685 podStartE2EDuration="11.757849516s" podCreationTimestamp="2025-12-05 09:09:13 +0000 UTC" firstStartedPulling="2025-12-05 09:09:15.608540438 +0000 UTC m=+2707.181144177" lastFinishedPulling="2025-12-05 09:09:23.962885279 +0000 UTC m=+2715.535489008" observedRunningTime="2025-12-05 09:09:24.735030669 +0000 UTC m=+2716.307634418" watchObservedRunningTime="2025-12-05 09:09:24.757849516 +0000 UTC m=+2716.330453265"
Dec 05 09:09:33 crc kubenswrapper[4795]: I1205 09:09:33.744592 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:33 crc kubenswrapper[4795]: I1205 09:09:33.745505 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:33 crc kubenswrapper[4795]: I1205 09:09:33.801255 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:33 crc kubenswrapper[4795]: I1205 09:09:33.873292 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qhq5q"
Dec 05 09:09:33 crc kubenswrapper[4795]: I1205 09:09:33.961899 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhq5q"]
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.054274 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9b5qq"]
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.054952 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9b5qq" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerName="registry-server" containerID="cri-o://bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16" gracePeriod=2
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.746514 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9b5qq"
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.850857 4795 generic.go:334] "Generic (PLEG): container finished" podID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerID="bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16" exitCode=0
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.851422 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9b5qq"
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.851647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9b5qq" event={"ID":"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d","Type":"ContainerDied","Data":"bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16"}
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.851752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9b5qq" event={"ID":"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d","Type":"ContainerDied","Data":"51fd62308d8d4eda05cfa629efa29824f1409d6ea1908e75b3f34a5f0ba2d0c7"}
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.851777 4795 scope.go:117] "RemoveContainer" containerID="bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16"
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.922489 4795 scope.go:117] "RemoveContainer" containerID="7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf"
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.926660 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnddc\" (UniqueName: \"kubernetes.io/projected/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-kube-api-access-qnddc\") pod \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") "
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.926735 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-utilities\") pod \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") "
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.927050 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-catalog-content\") pod \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\" (UID: \"6f7765ad-5efc-4453-b7c3-a19ca91c5e5d\") "
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.931742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-utilities" (OuterVolumeSpecName: "utilities") pod "6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" (UID: "6f7765ad-5efc-4453-b7c3-a19ca91c5e5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 09:09:34 crc kubenswrapper[4795]: I1205 09:09:34.971927 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-kube-api-access-qnddc" (OuterVolumeSpecName: "kube-api-access-qnddc") pod "6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" (UID: "6f7765ad-5efc-4453-b7c3-a19ca91c5e5d"). InnerVolumeSpecName "kube-api-access-qnddc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.034192 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnddc\" (UniqueName: \"kubernetes.io/projected/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-kube-api-access-qnddc\") on node \"crc\" DevicePath \"\""
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.034240 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.054726 4795 scope.go:117] "RemoveContainer" containerID="e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc"
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.080469 4795 scope.go:117] "RemoveContainer" containerID="bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16"
Dec 05 09:09:35 crc kubenswrapper[4795]: E1205 09:09:35.081370 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16\": container with ID starting with bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16 not found: ID does not exist" containerID="bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16"
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.081434 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16"} err="failed to get container status \"bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16\": rpc error: code = NotFound desc = could not find container \"bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16\": container with ID starting with bb571483ff45cef9bd184e6ae7b9a87f893607a6997f5bdf9b97aeea1e3bab16 not found: ID does not exist"
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.081471 4795 scope.go:117] "RemoveContainer" containerID="7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf"
Dec 05 09:09:35 crc kubenswrapper[4795]: E1205 09:09:35.082158 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf\": container with ID starting with 7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf not found: ID does not exist" containerID="7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf"
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.082222 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf"} err="failed to get container status \"7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf\": rpc error: code = NotFound desc = could not find container \"7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf\": container with ID starting with 7fd031bd8031a5a30615576f3d5a381dd4d6bc45b297e0d34ae9cb90919c4adf not found: ID does not exist"
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.082261 4795 scope.go:117] "RemoveContainer" containerID="e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc"
Dec 05 09:09:35 crc kubenswrapper[4795]: E1205 09:09:35.082730 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc\": container with ID starting with e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc not found: ID does not exist" containerID="e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc"
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.082785 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc"} err="failed to get container status \"e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc\": rpc error: code = NotFound desc = could not find container \"e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc\": container with ID starting with e5c44203b24930637137257be66693b5692a26f21476cbdfecade434249d20dc not found: ID does not exist"
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.193289 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" (UID: "6f7765ad-5efc-4453-b7c3-a19ca91c5e5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.239382 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.497913 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9b5qq"]
Dec 05 09:09:35 crc kubenswrapper[4795]: I1205 09:09:35.512113 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9b5qq"]
Dec 05 09:09:36 crc kubenswrapper[4795]: I1205 09:09:36.762254 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" path="/var/lib/kubelet/pods/6f7765ad-5efc-4453-b7c3-a19ca91c5e5d/volumes"
Dec 05 09:11:40 crc kubenswrapper[4795]: I1205 09:11:40.826825 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 05 09:11:40 crc kubenswrapper[4795]: I1205 09:11:40.829418 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.510836 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xr6z8"]
Dec 05 09:11:43 crc kubenswrapper[4795]: E1205 09:11:43.512222 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerName="registry-server"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.512240 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerName="registry-server"
Dec 05 09:11:43 crc kubenswrapper[4795]: E1205 09:11:43.512258 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerName="extract-content"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.512264 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerName="extract-content"
Dec 05 09:11:43 crc kubenswrapper[4795]: E1205 09:11:43.512292 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerName="extract-utilities"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.512298 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerName="extract-utilities"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.512533 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7765ad-5efc-4453-b7c3-a19ca91c5e5d" containerName="registry-server"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.514036 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.542340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xr6z8"]
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.613390 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-catalog-content\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.613929 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7v5m\" (UniqueName: \"kubernetes.io/projected/cee45036-89e0-4101-9463-524e1bcb4486-kube-api-access-v7v5m\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.614091 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-utilities\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.716340 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-catalog-content\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.716420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7v5m\" (UniqueName: \"kubernetes.io/projected/cee45036-89e0-4101-9463-524e1bcb4486-kube-api-access-v7v5m\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.716457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-utilities\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.716988 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-catalog-content\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.717066 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-utilities\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.742850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7v5m\" (UniqueName: \"kubernetes.io/projected/cee45036-89e0-4101-9463-524e1bcb4486-kube-api-access-v7v5m\") pod \"certified-operators-xr6z8\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") " pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:43 crc kubenswrapper[4795]: I1205 09:11:43.843308 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:44 crc kubenswrapper[4795]: I1205 09:11:44.566094 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xr6z8"]
Dec 05 09:11:44 crc kubenswrapper[4795]: W1205 09:11:44.591847 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee45036_89e0_4101_9463_524e1bcb4486.slice/crio-b5af15bfe1b69c897d102e5122095aa0c9f66e5a532d9f07c2fae01f8c6ae2ae WatchSource:0}: Error finding container b5af15bfe1b69c897d102e5122095aa0c9f66e5a532d9f07c2fae01f8c6ae2ae: Status 404 returned error can't find the container with id b5af15bfe1b69c897d102e5122095aa0c9f66e5a532d9f07c2fae01f8c6ae2ae
Dec 05 09:11:45 crc kubenswrapper[4795]: I1205 09:11:45.212889 4795 generic.go:334] "Generic (PLEG): container finished" podID="cee45036-89e0-4101-9463-524e1bcb4486" containerID="168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18" exitCode=0
Dec 05 09:11:45 crc kubenswrapper[4795]: I1205 09:11:45.213066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr6z8" event={"ID":"cee45036-89e0-4101-9463-524e1bcb4486","Type":"ContainerDied","Data":"168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18"}
Dec 05 09:11:45 crc kubenswrapper[4795]: I1205 09:11:45.213325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr6z8" event={"ID":"cee45036-89e0-4101-9463-524e1bcb4486","Type":"ContainerStarted","Data":"b5af15bfe1b69c897d102e5122095aa0c9f66e5a532d9f07c2fae01f8c6ae2ae"}
Dec 05 09:11:46 crc kubenswrapper[4795]: I1205 09:11:46.226291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr6z8" event={"ID":"cee45036-89e0-4101-9463-524e1bcb4486","Type":"ContainerStarted","Data":"8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704"}
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.240205 4795 generic.go:334] "Generic (PLEG): container finished" podID="cee45036-89e0-4101-9463-524e1bcb4486" containerID="8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704" exitCode=0
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.240320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr6z8" event={"ID":"cee45036-89e0-4101-9463-524e1bcb4486","Type":"ContainerDied","Data":"8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704"}
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.713895 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nxgk4"]
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.716589 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.736247 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nxgk4"]
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.825162 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkf5\" (UniqueName: \"kubernetes.io/projected/f54f11c1-a4c0-456a-a4d4-c55480a00e60-kube-api-access-sdkf5\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.825510 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-utilities\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.825838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-catalog-content\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.928780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdkf5\" (UniqueName: \"kubernetes.io/projected/f54f11c1-a4c0-456a-a4d4-c55480a00e60-kube-api-access-sdkf5\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.928856 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-utilities\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.928937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-catalog-content\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.929673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-utilities\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.929724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-catalog-content\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:47 crc kubenswrapper[4795]: I1205 09:11:47.952908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdkf5\" (UniqueName: \"kubernetes.io/projected/f54f11c1-a4c0-456a-a4d4-c55480a00e60-kube-api-access-sdkf5\") pod \"redhat-operators-nxgk4\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:48 crc kubenswrapper[4795]: I1205 09:11:48.128576 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nxgk4"
Dec 05 09:11:48 crc kubenswrapper[4795]: I1205 09:11:48.266467 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr6z8" event={"ID":"cee45036-89e0-4101-9463-524e1bcb4486","Type":"ContainerStarted","Data":"996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef"}
Dec 05 09:11:48 crc kubenswrapper[4795]: I1205 09:11:48.300400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xr6z8" podStartSLOduration=2.876915438 podStartE2EDuration="5.300369452s" podCreationTimestamp="2025-12-05 09:11:43 +0000 UTC" firstStartedPulling="2025-12-05 09:11:45.216293227 +0000 UTC m=+2856.788896966" lastFinishedPulling="2025-12-05 09:11:47.639747231 +0000 UTC m=+2859.212350980" observedRunningTime="2025-12-05 09:11:48.289385965 +0000 UTC m=+2859.861989714" watchObservedRunningTime="2025-12-05 09:11:48.300369452 +0000 UTC m=+2859.872973191"
Dec 05 09:11:48 crc kubenswrapper[4795]: I1205 09:11:48.841142 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nxgk4"]
Dec 05 09:11:49 crc kubenswrapper[4795]: I1205 09:11:49.286453 4795 generic.go:334] "Generic (PLEG): container finished" podID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerID="96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b" exitCode=0
Dec 05 09:11:49 crc kubenswrapper[4795]: I1205 09:11:49.288628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxgk4" event={"ID":"f54f11c1-a4c0-456a-a4d4-c55480a00e60","Type":"ContainerDied","Data":"96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b"}
Dec 05 09:11:49 crc kubenswrapper[4795]: I1205 09:11:49.289192 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxgk4" event={"ID":"f54f11c1-a4c0-456a-a4d4-c55480a00e60","Type":"ContainerStarted","Data":"5ac6f5597c15fe12e8ce26e0a1a5c03d25a289f4fa580bcd621a8275436f94b9"}
Dec 05 09:11:50 crc kubenswrapper[4795]: I1205 09:11:50.303298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxgk4" event={"ID":"f54f11c1-a4c0-456a-a4d4-c55480a00e60","Type":"ContainerStarted","Data":"d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39"}
Dec 05 09:11:53 crc kubenswrapper[4795]: I1205 09:11:53.691067 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out"
Dec 05 09:11:53 crc kubenswrapper[4795]: I1205 09:11:53.692353 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out"
Dec 05 09:11:53 crc kubenswrapper[4795]: I1205 09:11:53.844568 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:53 crc kubenswrapper[4795]: I1205 09:11:53.844660 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:53 crc kubenswrapper[4795]: I1205 09:11:53.911316 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:54 crc kubenswrapper[4795]: I1205 09:11:54.412503 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:54 crc kubenswrapper[4795]: I1205 09:11:54.699578 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xr6z8"]
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.384868 4795 generic.go:334] "Generic (PLEG): container finished" podID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerID="d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39" exitCode=0
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.385000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxgk4" event={"ID":"f54f11c1-a4c0-456a-a4d4-c55480a00e60","Type":"ContainerDied","Data":"d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39"}
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.385706 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xr6z8" podUID="cee45036-89e0-4101-9463-524e1bcb4486" containerName="registry-server" containerID="cri-o://996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef" gracePeriod=2
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.880636 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xr6z8"
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.935917 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-catalog-content\") pod \"cee45036-89e0-4101-9463-524e1bcb4486\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") "
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.936236 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7v5m\" (UniqueName: \"kubernetes.io/projected/cee45036-89e0-4101-9463-524e1bcb4486-kube-api-access-v7v5m\") pod \"cee45036-89e0-4101-9463-524e1bcb4486\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") "
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.936449 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-utilities\") pod \"cee45036-89e0-4101-9463-524e1bcb4486\" (UID: \"cee45036-89e0-4101-9463-524e1bcb4486\") "
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.938189 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-utilities" (OuterVolumeSpecName: "utilities") pod "cee45036-89e0-4101-9463-524e1bcb4486" (UID: "cee45036-89e0-4101-9463-524e1bcb4486"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.946718 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee45036-89e0-4101-9463-524e1bcb4486-kube-api-access-v7v5m" (OuterVolumeSpecName: "kube-api-access-v7v5m") pod "cee45036-89e0-4101-9463-524e1bcb4486" (UID: "cee45036-89e0-4101-9463-524e1bcb4486"). InnerVolumeSpecName "kube-api-access-v7v5m".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:11:56 crc kubenswrapper[4795]: I1205 09:11:56.989211 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cee45036-89e0-4101-9463-524e1bcb4486" (UID: "cee45036-89e0-4101-9463-524e1bcb4486"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.040785 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7v5m\" (UniqueName: \"kubernetes.io/projected/cee45036-89e0-4101-9463-524e1bcb4486-kube-api-access-v7v5m\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.040944 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.041056 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cee45036-89e0-4101-9463-524e1bcb4486-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.398417 4795 generic.go:334] "Generic (PLEG): container finished" podID="cee45036-89e0-4101-9463-524e1bcb4486" containerID="996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef" exitCode=0 Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.398522 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xr6z8" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.398529 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr6z8" event={"ID":"cee45036-89e0-4101-9463-524e1bcb4486","Type":"ContainerDied","Data":"996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef"} Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.399084 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xr6z8" event={"ID":"cee45036-89e0-4101-9463-524e1bcb4486","Type":"ContainerDied","Data":"b5af15bfe1b69c897d102e5122095aa0c9f66e5a532d9f07c2fae01f8c6ae2ae"} Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.399148 4795 scope.go:117] "RemoveContainer" containerID="996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.401353 4795 generic.go:334] "Generic (PLEG): container finished" podID="6fca34cb-0c72-422c-86e9-638584bb9dcb" containerID="8facf4e25635888dabff1790f496fc07a408f337e1e4e844a8d57da1e9920888" exitCode=0 Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.401417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" event={"ID":"6fca34cb-0c72-422c-86e9-638584bb9dcb","Type":"ContainerDied","Data":"8facf4e25635888dabff1790f496fc07a408f337e1e4e844a8d57da1e9920888"} Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.404634 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxgk4" event={"ID":"f54f11c1-a4c0-456a-a4d4-c55480a00e60","Type":"ContainerStarted","Data":"c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3"} Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.424970 4795 scope.go:117] "RemoveContainer" containerID="8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704" Dec 05 
09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.442973 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nxgk4" podStartSLOduration=2.913359959 podStartE2EDuration="10.442947548s" podCreationTimestamp="2025-12-05 09:11:47 +0000 UTC" firstStartedPulling="2025-12-05 09:11:49.289937276 +0000 UTC m=+2860.862541025" lastFinishedPulling="2025-12-05 09:11:56.819524875 +0000 UTC m=+2868.392128614" observedRunningTime="2025-12-05 09:11:57.4396701 +0000 UTC m=+2869.012273869" watchObservedRunningTime="2025-12-05 09:11:57.442947548 +0000 UTC m=+2869.015551307" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.487317 4795 scope.go:117] "RemoveContainer" containerID="168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.530561 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xr6z8"] Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.557705 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xr6z8"] Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.558599 4795 scope.go:117] "RemoveContainer" containerID="996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef" Dec 05 09:11:57 crc kubenswrapper[4795]: E1205 09:11:57.559253 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef\": container with ID starting with 996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef not found: ID does not exist" containerID="996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.559286 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef"} err="failed to get container status \"996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef\": rpc error: code = NotFound desc = could not find container \"996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef\": container with ID starting with 996f0c7134575c5f22ac211d1dd1dcf0e337a6a7fc1a284db8630bbea8476fef not found: ID does not exist" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.559316 4795 scope.go:117] "RemoveContainer" containerID="8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704" Dec 05 09:11:57 crc kubenswrapper[4795]: E1205 09:11:57.559779 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704\": container with ID starting with 8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704 not found: ID does not exist" containerID="8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.559891 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704"} err="failed to get container status \"8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704\": rpc error: code = NotFound desc = could not find container \"8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704\": container with ID starting with 8e97ea4beb448c68151e977a3e7d6adbaa00dde4c6f053b60d210434278ff704 not found: ID does not exist" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.559965 4795 scope.go:117] "RemoveContainer" containerID="168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18" Dec 05 09:11:57 crc kubenswrapper[4795]: E1205 09:11:57.560286 4795 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18\": container with ID starting with 168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18 not found: ID does not exist" containerID="168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18" Dec 05 09:11:57 crc kubenswrapper[4795]: I1205 09:11:57.560319 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18"} err="failed to get container status \"168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18\": rpc error: code = NotFound desc = could not find container \"168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18\": container with ID starting with 168efddac928e20e2201c268fdfd6838cbf3250aed1bf8d927a8fc7e1e028e18 not found: ID does not exist" Dec 05 09:11:58 crc kubenswrapper[4795]: I1205 09:11:58.129533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nxgk4" Dec 05 09:11:58 crc kubenswrapper[4795]: I1205 09:11:58.129766 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nxgk4" Dec 05 09:11:58 crc kubenswrapper[4795]: I1205 09:11:58.783054 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee45036-89e0-4101-9463-524e1bcb4486" path="/var/lib/kubelet/pods/cee45036-89e0-4101-9463-524e1bcb4486/volumes" Dec 05 09:11:58 crc kubenswrapper[4795]: I1205 09:11:58.945845 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093337 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-0\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093419 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-ssh-key\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093442 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-inventory\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-extra-config-0\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093501 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-1\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093525 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-1\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093571 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf9zg\" (UniqueName: \"kubernetes.io/projected/6fca34cb-0c72-422c-86e9-638584bb9dcb-kube-api-access-vf9zg\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093650 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-combined-ca-bundle\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.093683 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-0\") pod \"6fca34cb-0c72-422c-86e9-638584bb9dcb\" (UID: \"6fca34cb-0c72-422c-86e9-638584bb9dcb\") " Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.101577 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fca34cb-0c72-422c-86e9-638584bb9dcb-kube-api-access-vf9zg" (OuterVolumeSpecName: "kube-api-access-vf9zg") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "kube-api-access-vf9zg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.137940 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.165716 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.193071 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nxgk4" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="registry-server" probeResult="failure" output=< Dec 05 09:11:59 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:11:59 crc kubenswrapper[4795]: > Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.197174 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.197210 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf9zg\" (UniqueName: \"kubernetes.io/projected/6fca34cb-0c72-422c-86e9-638584bb9dcb-kube-api-access-vf9zg\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.197220 4795 reconciler_common.go:293] "Volume detached for 
volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.206767 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.207762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.221105 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.230843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.231440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.241310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-inventory" (OuterVolumeSpecName: "inventory") pod "6fca34cb-0c72-422c-86e9-638584bb9dcb" (UID: "6fca34cb-0c72-422c-86e9-638584bb9dcb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.299218 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.299259 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.299272 4795 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.299281 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.299291 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.299302 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6fca34cb-0c72-422c-86e9-638584bb9dcb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.430105 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.430648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-m5khq" event={"ID":"6fca34cb-0c72-422c-86e9-638584bb9dcb","Type":"ContainerDied","Data":"c922b6a190b1a8dab734501d7effaa307463e5963982d8620d86478109aa1666"} Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.430692 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c922b6a190b1a8dab734501d7effaa307463e5963982d8620d86478109aa1666" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.649647 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6"] Dec 05 09:11:59 crc kubenswrapper[4795]: E1205 09:11:59.650113 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee45036-89e0-4101-9463-524e1bcb4486" containerName="registry-server" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.650130 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cee45036-89e0-4101-9463-524e1bcb4486" containerName="registry-server" Dec 05 09:11:59 crc kubenswrapper[4795]: E1205 09:11:59.650141 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee45036-89e0-4101-9463-524e1bcb4486" containerName="extract-utilities" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.650149 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee45036-89e0-4101-9463-524e1bcb4486" containerName="extract-utilities" Dec 05 09:11:59 crc kubenswrapper[4795]: E1205 09:11:59.650160 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fca34cb-0c72-422c-86e9-638584bb9dcb" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.650167 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fca34cb-0c72-422c-86e9-638584bb9dcb" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 09:11:59 crc kubenswrapper[4795]: E1205 09:11:59.650208 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee45036-89e0-4101-9463-524e1bcb4486" containerName="extract-content" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.650214 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee45036-89e0-4101-9463-524e1bcb4486" containerName="extract-content" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.650396 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee45036-89e0-4101-9463-524e1bcb4486" containerName="registry-server" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.650420 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fca34cb-0c72-422c-86e9-638584bb9dcb" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.651133 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.655388 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.655413 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.656013 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-4rnp8" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.662883 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.663100 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.675549 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6"] Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.809496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.809975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.810073 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.810176 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.810215 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.810277 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59vg4\" (UniqueName: \"kubernetes.io/projected/594805cd-d62b-47e5-9ad8-1c423b5fcebd-kube-api-access-59vg4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" 
Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.810325 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.912211 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.912273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.912364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59vg4\" (UniqueName: \"kubernetes.io/projected/594805cd-d62b-47e5-9ad8-1c423b5fcebd-kube-api-access-59vg4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.912419 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" 
(UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.912482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.912530 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.912571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.920193 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: 
\"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.921186 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.922596 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.931504 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.931731 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.933488 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:11:59 crc kubenswrapper[4795]: I1205 09:11:59.934510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59vg4\" (UniqueName: \"kubernetes.io/projected/594805cd-d62b-47e5-9ad8-1c423b5fcebd-kube-api-access-59vg4\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:12:00 crc kubenswrapper[4795]: I1205 09:12:00.000393 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:12:00 crc kubenswrapper[4795]: I1205 09:12:00.627189 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6"] Dec 05 09:12:01 crc kubenswrapper[4795]: I1205 09:12:01.464373 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" event={"ID":"594805cd-d62b-47e5-9ad8-1c423b5fcebd","Type":"ContainerStarted","Data":"06f5ad6c913a814286c3c7d8b0f493d4cfc95e959d1f2f1e1a0c7d21a3595bd5"} Dec 05 09:12:02 crc kubenswrapper[4795]: I1205 09:12:02.477150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" event={"ID":"594805cd-d62b-47e5-9ad8-1c423b5fcebd","Type":"ContainerStarted","Data":"34e28c8b6ebadeb4624c0cf05fbba2ca15d0efc5498559cdc3781b49d116b69f"} Dec 05 09:12:03 crc kubenswrapper[4795]: I1205 09:12:03.510495 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" podStartSLOduration=3.170742432 podStartE2EDuration="4.510462154s" podCreationTimestamp="2025-12-05 09:11:59 +0000 UTC" firstStartedPulling="2025-12-05 09:12:00.633605368 +0000 UTC m=+2872.206209107" lastFinishedPulling="2025-12-05 09:12:01.97332509 +0000 UTC m=+2873.545928829" observedRunningTime="2025-12-05 09:12:03.504962375 +0000 UTC m=+2875.077566114" watchObservedRunningTime="2025-12-05 09:12:03.510462154 +0000 UTC m=+2875.083065903" Dec 05 09:12:08 crc kubenswrapper[4795]: I1205 09:12:08.185384 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nxgk4" Dec 05 09:12:08 crc kubenswrapper[4795]: I1205 09:12:08.243876 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nxgk4" Dec 05 09:12:08 crc kubenswrapper[4795]: I1205 09:12:08.429767 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nxgk4"] Dec 05 09:12:09 crc kubenswrapper[4795]: I1205 09:12:09.580399 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nxgk4" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="registry-server" containerID="cri-o://c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3" gracePeriod=2 Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.111594 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nxgk4" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.254225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-utilities\") pod \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.254313 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-catalog-content\") pod \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.254873 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdkf5\" (UniqueName: \"kubernetes.io/projected/f54f11c1-a4c0-456a-a4d4-c55480a00e60-kube-api-access-sdkf5\") pod \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\" (UID: \"f54f11c1-a4c0-456a-a4d4-c55480a00e60\") " Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.255418 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-utilities" (OuterVolumeSpecName: "utilities") pod "f54f11c1-a4c0-456a-a4d4-c55480a00e60" (UID: "f54f11c1-a4c0-456a-a4d4-c55480a00e60"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.255914 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.263099 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54f11c1-a4c0-456a-a4d4-c55480a00e60-kube-api-access-sdkf5" (OuterVolumeSpecName: "kube-api-access-sdkf5") pod "f54f11c1-a4c0-456a-a4d4-c55480a00e60" (UID: "f54f11c1-a4c0-456a-a4d4-c55480a00e60"). InnerVolumeSpecName "kube-api-access-sdkf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.358114 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdkf5\" (UniqueName: \"kubernetes.io/projected/f54f11c1-a4c0-456a-a4d4-c55480a00e60-kube-api-access-sdkf5\") on node \"crc\" DevicePath \"\"" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.379951 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f54f11c1-a4c0-456a-a4d4-c55480a00e60" (UID: "f54f11c1-a4c0-456a-a4d4-c55480a00e60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.460178 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f54f11c1-a4c0-456a-a4d4-c55480a00e60-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.636327 4795 generic.go:334] "Generic (PLEG): container finished" podID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerID="c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3" exitCode=0 Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.636389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxgk4" event={"ID":"f54f11c1-a4c0-456a-a4d4-c55480a00e60","Type":"ContainerDied","Data":"c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3"} Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.636426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nxgk4" event={"ID":"f54f11c1-a4c0-456a-a4d4-c55480a00e60","Type":"ContainerDied","Data":"5ac6f5597c15fe12e8ce26e0a1a5c03d25a289f4fa580bcd621a8275436f94b9"} Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.636471 4795 scope.go:117] "RemoveContainer" containerID="c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.636750 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nxgk4" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.700795 4795 scope.go:117] "RemoveContainer" containerID="d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.713074 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nxgk4"] Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.734596 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nxgk4"] Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.738850 4795 scope.go:117] "RemoveContainer" containerID="96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.760367 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" path="/var/lib/kubelet/pods/f54f11c1-a4c0-456a-a4d4-c55480a00e60/volumes" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.788852 4795 scope.go:117] "RemoveContainer" containerID="c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3" Dec 05 09:12:10 crc kubenswrapper[4795]: E1205 09:12:10.789256 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3\": container with ID starting with c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3 not found: ID does not exist" containerID="c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.789296 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3"} err="failed to get container status 
\"c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3\": rpc error: code = NotFound desc = could not find container \"c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3\": container with ID starting with c019900b0bac4f8f80bee8829b338c09cde5dcd06739091d0c86085c62d569e3 not found: ID does not exist" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.789325 4795 scope.go:117] "RemoveContainer" containerID="d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39" Dec 05 09:12:10 crc kubenswrapper[4795]: E1205 09:12:10.789637 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39\": container with ID starting with d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39 not found: ID does not exist" containerID="d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.789860 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39"} err="failed to get container status \"d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39\": rpc error: code = NotFound desc = could not find container \"d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39\": container with ID starting with d4ce4be23cdc17d0d8c34c402920f7bc46352e44d1d596d17ae74affe59ccf39 not found: ID does not exist" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.789878 4795 scope.go:117] "RemoveContainer" containerID="96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b" Dec 05 09:12:10 crc kubenswrapper[4795]: E1205 09:12:10.790100 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b\": container with ID starting with 96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b not found: ID does not exist" containerID="96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.790120 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b"} err="failed to get container status \"96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b\": rpc error: code = NotFound desc = could not find container \"96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b\": container with ID starting with 96e759c8218c9d7e565324a2290625a470b20f8004bf046363917a21f5cc374b not found: ID does not exist" Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.826547 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:12:10 crc kubenswrapper[4795]: I1205 09:12:10.826640 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:12:10 crc kubenswrapper[4795]: E1205 09:12:10.846291 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf54f11c1_a4c0_456a_a4d4_c55480a00e60.slice\": RecentStats: unable to find data in memory cache]" Dec 05 09:12:40 crc kubenswrapper[4795]: I1205 
09:12:40.826598 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:12:40 crc kubenswrapper[4795]: I1205 09:12:40.827496 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:12:40 crc kubenswrapper[4795]: I1205 09:12:40.827560 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:12:40 crc kubenswrapper[4795]: I1205 09:12:40.828086 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c28c21c58cbde5dcd526aaa1c5dc6288bc2bfd298ef9e3bbe818c729d69598f8"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:12:40 crc kubenswrapper[4795]: I1205 09:12:40.828138 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://c28c21c58cbde5dcd526aaa1c5dc6288bc2bfd298ef9e3bbe818c729d69598f8" gracePeriod=600 Dec 05 09:12:40 crc kubenswrapper[4795]: I1205 09:12:40.983256 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="c28c21c58cbde5dcd526aaa1c5dc6288bc2bfd298ef9e3bbe818c729d69598f8" exitCode=0 Dec 05 
09:12:40 crc kubenswrapper[4795]: I1205 09:12:40.983328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"c28c21c58cbde5dcd526aaa1c5dc6288bc2bfd298ef9e3bbe818c729d69598f8"} Dec 05 09:12:40 crc kubenswrapper[4795]: I1205 09:12:40.983378 4795 scope.go:117] "RemoveContainer" containerID="bf6983f43c4b84d0e617dd5cf385cf6233e589cac425e1671cfa3e99ac80dde5" Dec 05 09:12:42 crc kubenswrapper[4795]: I1205 09:12:42.000124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64"} Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.161308 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf"] Dec 05 09:15:00 crc kubenswrapper[4795]: E1205 09:15:00.162560 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="extract-utilities" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.162577 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="extract-utilities" Dec 05 09:15:00 crc kubenswrapper[4795]: E1205 09:15:00.162597 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="registry-server" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.162604 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="registry-server" Dec 05 09:15:00 crc kubenswrapper[4795]: E1205 09:15:00.162637 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="extract-content" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.162645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="extract-content" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.162871 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54f11c1-a4c0-456a-a4d4-c55480a00e60" containerName="registry-server" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.163707 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.167942 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.181363 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.183686 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf"] Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.272216 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzl7r\" (UniqueName: \"kubernetes.io/projected/9b675ff9-bacc-46ea-a03d-5872e4ba173c-kube-api-access-bzl7r\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.272307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9b675ff9-bacc-46ea-a03d-5872e4ba173c-secret-volume\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.272392 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b675ff9-bacc-46ea-a03d-5872e4ba173c-config-volume\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.376436 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b675ff9-bacc-46ea-a03d-5872e4ba173c-config-volume\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.377206 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzl7r\" (UniqueName: \"kubernetes.io/projected/9b675ff9-bacc-46ea-a03d-5872e4ba173c-kube-api-access-bzl7r\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.377273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b675ff9-bacc-46ea-a03d-5872e4ba173c-secret-volume\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: 
I1205 09:15:00.387817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b675ff9-bacc-46ea-a03d-5872e4ba173c-config-volume\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.422303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzl7r\" (UniqueName: \"kubernetes.io/projected/9b675ff9-bacc-46ea-a03d-5872e4ba173c-kube-api-access-bzl7r\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.433362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b675ff9-bacc-46ea-a03d-5872e4ba173c-secret-volume\") pod \"collect-profiles-29415435-jztdf\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:00 crc kubenswrapper[4795]: I1205 09:15:00.488343 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:01 crc kubenswrapper[4795]: I1205 09:15:01.043913 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf"] Dec 05 09:15:01 crc kubenswrapper[4795]: I1205 09:15:01.517002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" event={"ID":"9b675ff9-bacc-46ea-a03d-5872e4ba173c","Type":"ContainerStarted","Data":"fdc0b3b270a490098118e9c58f4b0f531b3a657b60d7b4cd84bf0910a30887ec"} Dec 05 09:15:01 crc kubenswrapper[4795]: I1205 09:15:01.517496 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" event={"ID":"9b675ff9-bacc-46ea-a03d-5872e4ba173c","Type":"ContainerStarted","Data":"aa1a85af0e12dc6074da324710fd66c28a3c97859d990829e069fd9d3c74c3c2"} Dec 05 09:15:01 crc kubenswrapper[4795]: I1205 09:15:01.553442 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" podStartSLOduration=1.553409644 podStartE2EDuration="1.553409644s" podCreationTimestamp="2025-12-05 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 09:15:01.541939494 +0000 UTC m=+3053.114543233" watchObservedRunningTime="2025-12-05 09:15:01.553409644 +0000 UTC m=+3053.126013383" Dec 05 09:15:02 crc kubenswrapper[4795]: I1205 09:15:02.531569 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b675ff9-bacc-46ea-a03d-5872e4ba173c" containerID="fdc0b3b270a490098118e9c58f4b0f531b3a657b60d7b4cd84bf0910a30887ec" exitCode=0 Dec 05 09:15:02 crc kubenswrapper[4795]: I1205 09:15:02.531921 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" event={"ID":"9b675ff9-bacc-46ea-a03d-5872e4ba173c","Type":"ContainerDied","Data":"fdc0b3b270a490098118e9c58f4b0f531b3a657b60d7b4cd84bf0910a30887ec"} Dec 05 09:15:03 crc kubenswrapper[4795]: I1205 09:15:03.898043 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:03 crc kubenswrapper[4795]: I1205 09:15:03.981586 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzl7r\" (UniqueName: \"kubernetes.io/projected/9b675ff9-bacc-46ea-a03d-5872e4ba173c-kube-api-access-bzl7r\") pod \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " Dec 05 09:15:03 crc kubenswrapper[4795]: I1205 09:15:03.982169 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b675ff9-bacc-46ea-a03d-5872e4ba173c-config-volume\") pod \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " Dec 05 09:15:03 crc kubenswrapper[4795]: I1205 09:15:03.982370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b675ff9-bacc-46ea-a03d-5872e4ba173c-secret-volume\") pod \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\" (UID: \"9b675ff9-bacc-46ea-a03d-5872e4ba173c\") " Dec 05 09:15:03 crc kubenswrapper[4795]: I1205 09:15:03.983783 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b675ff9-bacc-46ea-a03d-5872e4ba173c-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b675ff9-bacc-46ea-a03d-5872e4ba173c" (UID: "9b675ff9-bacc-46ea-a03d-5872e4ba173c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 09:15:03 crc kubenswrapper[4795]: I1205 09:15:03.991159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b675ff9-bacc-46ea-a03d-5872e4ba173c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b675ff9-bacc-46ea-a03d-5872e4ba173c" (UID: "9b675ff9-bacc-46ea-a03d-5872e4ba173c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:15:03 crc kubenswrapper[4795]: I1205 09:15:03.996350 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b675ff9-bacc-46ea-a03d-5872e4ba173c-kube-api-access-bzl7r" (OuterVolumeSpecName: "kube-api-access-bzl7r") pod "9b675ff9-bacc-46ea-a03d-5872e4ba173c" (UID: "9b675ff9-bacc-46ea-a03d-5872e4ba173c"). InnerVolumeSpecName "kube-api-access-bzl7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.085718 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b675ff9-bacc-46ea-a03d-5872e4ba173c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.085762 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzl7r\" (UniqueName: \"kubernetes.io/projected/9b675ff9-bacc-46ea-a03d-5872e4ba173c-kube-api-access-bzl7r\") on node \"crc\" DevicePath \"\"" Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.085772 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b675ff9-bacc-46ea-a03d-5872e4ba173c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.557349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" 
event={"ID":"9b675ff9-bacc-46ea-a03d-5872e4ba173c","Type":"ContainerDied","Data":"aa1a85af0e12dc6074da324710fd66c28a3c97859d990829e069fd9d3c74c3c2"} Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.557419 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa1a85af0e12dc6074da324710fd66c28a3c97859d990829e069fd9d3c74c3c2" Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.557501 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf" Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.669522 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm"] Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.683943 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415390-gv4vm"] Dec 05 09:15:04 crc kubenswrapper[4795]: I1205 09:15:04.778776 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde" path="/var/lib/kubelet/pods/dd3f0c2e-5a4b-4d82-8540-6ca0ce162cde/volumes" Dec 05 09:15:10 crc kubenswrapper[4795]: I1205 09:15:10.827741 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:15:10 crc kubenswrapper[4795]: I1205 09:15:10.828692 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:15:40 crc 
kubenswrapper[4795]: I1205 09:15:40.826889 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:15:40 crc kubenswrapper[4795]: I1205 09:15:40.827649 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:15:53 crc kubenswrapper[4795]: I1205 09:15:53.433686 4795 scope.go:117] "RemoveContainer" containerID="400e3a6765e627751435991cac890e46499f407d086d62de6b9c83223d438cd7" Dec 05 09:16:04 crc kubenswrapper[4795]: I1205 09:16:04.188951 4795 generic.go:334] "Generic (PLEG): container finished" podID="594805cd-d62b-47e5-9ad8-1c423b5fcebd" containerID="34e28c8b6ebadeb4624c0cf05fbba2ca15d0efc5498559cdc3781b49d116b69f" exitCode=0 Dec 05 09:16:04 crc kubenswrapper[4795]: I1205 09:16:04.189735 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" event={"ID":"594805cd-d62b-47e5-9ad8-1c423b5fcebd","Type":"ContainerDied","Data":"34e28c8b6ebadeb4624c0cf05fbba2ca15d0efc5498559cdc3781b49d116b69f"} Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.719092 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.913908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-2\") pod \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.914003 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ssh-key\") pod \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.914098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-inventory\") pod \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.914119 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-telemetry-combined-ca-bundle\") pod \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.914151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59vg4\" (UniqueName: \"kubernetes.io/projected/594805cd-d62b-47e5-9ad8-1c423b5fcebd-kube-api-access-59vg4\") pod \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.914209 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-0\") pod \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.914286 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-1\") pod \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\" (UID: \"594805cd-d62b-47e5-9ad8-1c423b5fcebd\") " Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.929112 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594805cd-d62b-47e5-9ad8-1c423b5fcebd-kube-api-access-59vg4" (OuterVolumeSpecName: "kube-api-access-59vg4") pod "594805cd-d62b-47e5-9ad8-1c423b5fcebd" (UID: "594805cd-d62b-47e5-9ad8-1c423b5fcebd"). InnerVolumeSpecName "kube-api-access-59vg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.949938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "594805cd-d62b-47e5-9ad8-1c423b5fcebd" (UID: "594805cd-d62b-47e5-9ad8-1c423b5fcebd"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.950934 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "594805cd-d62b-47e5-9ad8-1c423b5fcebd" (UID: "594805cd-d62b-47e5-9ad8-1c423b5fcebd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.956189 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-inventory" (OuterVolumeSpecName: "inventory") pod "594805cd-d62b-47e5-9ad8-1c423b5fcebd" (UID: "594805cd-d62b-47e5-9ad8-1c423b5fcebd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.962086 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "594805cd-d62b-47e5-9ad8-1c423b5fcebd" (UID: "594805cd-d62b-47e5-9ad8-1c423b5fcebd"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.965712 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "594805cd-d62b-47e5-9ad8-1c423b5fcebd" (UID: "594805cd-d62b-47e5-9ad8-1c423b5fcebd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:16:05 crc kubenswrapper[4795]: I1205 09:16:05.970597 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "594805cd-d62b-47e5-9ad8-1c423b5fcebd" (UID: "594805cd-d62b-47e5-9ad8-1c423b5fcebd"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.016114 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.016170 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.016192 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-inventory\") on node \"crc\" DevicePath \"\"" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.016206 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.016220 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59vg4\" (UniqueName: \"kubernetes.io/projected/594805cd-d62b-47e5-9ad8-1c423b5fcebd-kube-api-access-59vg4\") on node \"crc\" DevicePath \"\"" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.016231 4795 
reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.016242 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/594805cd-d62b-47e5-9ad8-1c423b5fcebd-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.211877 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" event={"ID":"594805cd-d62b-47e5-9ad8-1c423b5fcebd","Type":"ContainerDied","Data":"06f5ad6c913a814286c3c7d8b0f493d4cfc95e959d1f2f1e1a0c7d21a3595bd5"} Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.212245 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06f5ad6c913a814286c3c7d8b0f493d4cfc95e959d1f2f1e1a0c7d21a3595bd5" Dec 05 09:16:06 crc kubenswrapper[4795]: I1205 09:16:06.211943 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6" Dec 05 09:16:10 crc kubenswrapper[4795]: I1205 09:16:10.826822 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:16:10 crc kubenswrapper[4795]: I1205 09:16:10.827767 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:16:10 crc kubenswrapper[4795]: I1205 09:16:10.827838 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:16:10 crc kubenswrapper[4795]: I1205 09:16:10.828907 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:16:10 crc kubenswrapper[4795]: I1205 09:16:10.828974 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" gracePeriod=600 Dec 05 09:16:10 crc kubenswrapper[4795]: E1205 09:16:10.967063 4795 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:16:11 crc kubenswrapper[4795]: I1205 09:16:11.271062 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" exitCode=0 Dec 05 09:16:11 crc kubenswrapper[4795]: I1205 09:16:11.271129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64"} Dec 05 09:16:11 crc kubenswrapper[4795]: I1205 09:16:11.271217 4795 scope.go:117] "RemoveContainer" containerID="c28c21c58cbde5dcd526aaa1c5dc6288bc2bfd298ef9e3bbe818c729d69598f8" Dec 05 09:16:11 crc kubenswrapper[4795]: I1205 09:16:11.272092 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:16:11 crc kubenswrapper[4795]: E1205 09:16:11.272675 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:16:23 crc kubenswrapper[4795]: I1205 09:16:23.748036 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 
05 09:16:23 crc kubenswrapper[4795]: E1205 09:16:23.749077 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:16:36 crc kubenswrapper[4795]: I1205 09:16:36.748022 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:16:36 crc kubenswrapper[4795]: E1205 09:16:36.750025 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:16:51 crc kubenswrapper[4795]: I1205 09:16:51.747887 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:16:51 crc kubenswrapper[4795]: E1205 09:16:51.748463 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:17:04 crc kubenswrapper[4795]: I1205 09:17:04.748742 4795 scope.go:117] "RemoveContainer" 
containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:17:04 crc kubenswrapper[4795]: E1205 09:17:04.749801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.622564 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 09:17:11 crc kubenswrapper[4795]: E1205 09:17:11.623761 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b675ff9-bacc-46ea-a03d-5872e4ba173c" containerName="collect-profiles" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.623777 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b675ff9-bacc-46ea-a03d-5872e4ba173c" containerName="collect-profiles" Dec 05 09:17:11 crc kubenswrapper[4795]: E1205 09:17:11.623802 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594805cd-d62b-47e5-9ad8-1c423b5fcebd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.623810 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="594805cd-d62b-47e5-9ad8-1c423b5fcebd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.624019 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="594805cd-d62b-47e5-9ad8-1c423b5fcebd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.624030 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b675ff9-bacc-46ea-a03d-5872e4ba173c" 
containerName="collect-profiles" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.626483 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.636409 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.639257 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.639593 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.639830 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.640009 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-48rcz" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.795286 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.795824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.795861 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.795948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.795969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.796001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dh4w\" (UniqueName: \"kubernetes.io/projected/7f223c39-817f-4bac-9c3b-490359a0e44d-kube-api-access-5dh4w\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.796020 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-config-data\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.796040 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.796093 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.897809 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.897874 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.897926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dh4w\" (UniqueName: \"kubernetes.io/projected/7f223c39-817f-4bac-9c3b-490359a0e44d-kube-api-access-5dh4w\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.897946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-config-data\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.897966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.898035 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.898111 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.898129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.898153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.898935 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.899239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.899539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.900075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-config-data\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.900783 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: 
\"7f223c39-817f-4bac-9c3b-490359a0e44d\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.904958 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.906184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.925572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dh4w\" (UniqueName: \"kubernetes.io/projected/7f223c39-817f-4bac-9c3b-490359a0e44d-kube-api-access-5dh4w\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.926514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.954749 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " pod="openstack/tempest-tests-tempest" Dec 05 09:17:11 crc kubenswrapper[4795]: I1205 09:17:11.998697 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 09:17:12 crc kubenswrapper[4795]: I1205 09:17:12.525282 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 05 09:17:12 crc kubenswrapper[4795]: I1205 09:17:12.529729 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:17:12 crc kubenswrapper[4795]: I1205 09:17:12.942785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"7f223c39-817f-4bac-9c3b-490359a0e44d","Type":"ContainerStarted","Data":"29047ab2fc5889f5dd0ced6ddb0192201e9f995cdd002ea836bf75cd62b82a38"} Dec 05 09:17:15 crc kubenswrapper[4795]: I1205 09:17:15.747698 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:17:15 crc kubenswrapper[4795]: E1205 09:17:15.748496 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:17:18 crc kubenswrapper[4795]: I1205 09:17:18.316904 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-tkk5l" podUID="63be1623-e1cd-4904-99cb-9497a6596599" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.79:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:17:18 crc kubenswrapper[4795]: I1205 09:17:18.591029 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-m6snn" podUID="e8acf865-8373-4a37-ba22-bc276e596f2d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.81:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:17:26 crc kubenswrapper[4795]: I1205 09:17:26.756312 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:17:26 crc kubenswrapper[4795]: E1205 09:17:26.759127 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:17:39 crc kubenswrapper[4795]: I1205 09:17:39.747774 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:17:39 crc kubenswrapper[4795]: E1205 09:17:39.748819 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:17:43 crc kubenswrapper[4795]: I1205 09:17:43.691686 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:17:43 crc kubenswrapper[4795]: I1205 09:17:43.691889 4795 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:17:49 crc kubenswrapper[4795]: I1205 09:17:49.905859 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" podUID="2297e0a2-10ff-47d9-8acf-c94bf4bddc9f" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.48:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:17:54 crc kubenswrapper[4795]: I1205 09:17:54.747918 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:17:54 crc kubenswrapper[4795]: E1205 09:17:54.748753 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:18:08 crc kubenswrapper[4795]: I1205 09:18:08.762494 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:18:08 crc kubenswrapper[4795]: E1205 09:18:08.763646 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:18:26 crc kubenswrapper[4795]: I1205 
09:18:23.747517 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:18:26 crc kubenswrapper[4795]: E1205 09:18:23.748562 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:18:33 crc kubenswrapper[4795]: E1205 09:18:33.295588 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 05 09:18:33 crc kubenswrapper[4795]: E1205 09:18:33.299303 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5dh4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(7f223c39-817f-4bac-9c3b-490359a0e44d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 09:18:33 crc kubenswrapper[4795]: E1205 09:18:33.300884 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="7f223c39-817f-4bac-9c3b-490359a0e44d" Dec 05 09:18:33 crc kubenswrapper[4795]: E1205 09:18:33.665007 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="7f223c39-817f-4bac-9c3b-490359a0e44d" Dec 05 09:18:34 crc 
kubenswrapper[4795]: I1205 09:18:34.748311 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:18:34 crc kubenswrapper[4795]: E1205 09:18:34.748781 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:18:47 crc kubenswrapper[4795]: I1205 09:18:47.455526 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 05 09:18:49 crc kubenswrapper[4795]: I1205 09:18:49.748576 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:18:49 crc kubenswrapper[4795]: E1205 09:18:49.749582 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:18:49 crc kubenswrapper[4795]: I1205 09:18:49.891755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"7f223c39-817f-4bac-9c3b-490359a0e44d","Type":"ContainerStarted","Data":"6853b0e9af0634724323ea374ccf54d5da7cfca3f842ddce91b07a12efa15064"} Dec 05 09:18:49 crc kubenswrapper[4795]: I1205 09:18:49.915876 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" 
podStartSLOduration=4.994886268 podStartE2EDuration="1m39.915852409s" podCreationTimestamp="2025-12-05 09:17:10 +0000 UTC" firstStartedPulling="2025-12-05 09:17:12.529363878 +0000 UTC m=+3184.101967617" lastFinishedPulling="2025-12-05 09:18:47.450330019 +0000 UTC m=+3279.022933758" observedRunningTime="2025-12-05 09:18:49.910277648 +0000 UTC m=+3281.482881387" watchObservedRunningTime="2025-12-05 09:18:49.915852409 +0000 UTC m=+3281.488456158" Dec 05 09:19:04 crc kubenswrapper[4795]: I1205 09:19:04.753254 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:19:04 crc kubenswrapper[4795]: E1205 09:19:04.754446 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:19:15 crc kubenswrapper[4795]: I1205 09:19:15.748969 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:19:15 crc kubenswrapper[4795]: E1205 09:19:15.749981 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:19:30 crc kubenswrapper[4795]: I1205 09:19:30.748297 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:19:30 crc 
kubenswrapper[4795]: E1205 09:19:30.749652 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.035649 4795 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.02354349s: [/var/lib/containers/storage/overlay/5dc2971359ea1d56ea362a67559d9d672f30826ead602137d6ba251d38137749/diff /var/log/pods/openstack_openstack-galera-0_e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6/galera/0.log]; will not log again for this container unless duration exceeds 2s Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.739958 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmd24"] Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.744050 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.844170 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvwpb\" (UniqueName: \"kubernetes.io/projected/38315ddb-365c-40aa-99ad-29d3c46cbbd5-kube-api-access-wvwpb\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.844287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-catalog-content\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.844483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-utilities\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.863055 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmd24"] Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.947818 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-utilities\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.947950 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wvwpb\" (UniqueName: \"kubernetes.io/projected/38315ddb-365c-40aa-99ad-29d3c46cbbd5-kube-api-access-wvwpb\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.948016 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-catalog-content\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.949022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-catalog-content\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.949183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-utilities\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:37 crc kubenswrapper[4795]: I1205 09:19:37.990883 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvwpb\" (UniqueName: \"kubernetes.io/projected/38315ddb-365c-40aa-99ad-29d3c46cbbd5-kube-api-access-wvwpb\") pod \"community-operators-wmd24\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:38 crc kubenswrapper[4795]: I1205 09:19:38.078392 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:39 crc kubenswrapper[4795]: I1205 09:19:39.377594 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmd24"] Dec 05 09:19:39 crc kubenswrapper[4795]: I1205 09:19:39.493421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmd24" event={"ID":"38315ddb-365c-40aa-99ad-29d3c46cbbd5","Type":"ContainerStarted","Data":"0f8d9fe290f4a260a70132e4b4f9b81ef16e3288657b2b3a6624b3be580c5b30"} Dec 05 09:19:40 crc kubenswrapper[4795]: I1205 09:19:40.507702 4795 generic.go:334] "Generic (PLEG): container finished" podID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerID="2301fec7fbd59c8ac1fb90b894140d0a38480106a557b11101a97171c18ad96e" exitCode=0 Dec 05 09:19:40 crc kubenswrapper[4795]: I1205 09:19:40.507786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmd24" event={"ID":"38315ddb-365c-40aa-99ad-29d3c46cbbd5","Type":"ContainerDied","Data":"2301fec7fbd59c8ac1fb90b894140d0a38480106a557b11101a97171c18ad96e"} Dec 05 09:19:43 crc kubenswrapper[4795]: I1205 09:19:43.593745 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmd24" event={"ID":"38315ddb-365c-40aa-99ad-29d3c46cbbd5","Type":"ContainerStarted","Data":"ab8a0f261abc61cb26b48e1581377308c0528fc36d6e46f0c325f1be52a81cc0"} Dec 05 09:19:45 crc kubenswrapper[4795]: I1205 09:19:45.763244 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:19:45 crc kubenswrapper[4795]: E1205 09:19:45.764679 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:19:49 crc kubenswrapper[4795]: I1205 09:19:49.691861 4795 generic.go:334] "Generic (PLEG): container finished" podID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerID="ab8a0f261abc61cb26b48e1581377308c0528fc36d6e46f0c325f1be52a81cc0" exitCode=0 Dec 05 09:19:49 crc kubenswrapper[4795]: I1205 09:19:49.691998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmd24" event={"ID":"38315ddb-365c-40aa-99ad-29d3c46cbbd5","Type":"ContainerDied","Data":"ab8a0f261abc61cb26b48e1581377308c0528fc36d6e46f0c325f1be52a81cc0"} Dec 05 09:19:50 crc kubenswrapper[4795]: I1205 09:19:50.710065 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmd24" event={"ID":"38315ddb-365c-40aa-99ad-29d3c46cbbd5","Type":"ContainerStarted","Data":"65fa5fc00d38940f9ba660e8d9ed7175bbea0c525cb383eb51a93aad9de126c9"} Dec 05 09:19:50 crc kubenswrapper[4795]: I1205 09:19:50.745061 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmd24" podStartSLOduration=5.241217541 podStartE2EDuration="13.745024609s" podCreationTimestamp="2025-12-05 09:19:37 +0000 UTC" firstStartedPulling="2025-12-05 09:19:41.57756128 +0000 UTC m=+3333.150165009" lastFinishedPulling="2025-12-05 09:19:50.081368338 +0000 UTC m=+3341.653972077" observedRunningTime="2025-12-05 09:19:50.734686181 +0000 UTC m=+3342.307289930" watchObservedRunningTime="2025-12-05 09:19:50.745024609 +0000 UTC m=+3342.317628348" Dec 05 09:19:58 crc kubenswrapper[4795]: I1205 09:19:58.079432 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:58 crc kubenswrapper[4795]: I1205 
09:19:58.080222 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:19:58 crc kubenswrapper[4795]: I1205 09:19:58.756834 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:19:58 crc kubenswrapper[4795]: E1205 09:19:58.757247 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:19:58 crc kubenswrapper[4795]: I1205 09:19:58.963731 4795 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 09:19:58 crc kubenswrapper[4795]: I1205 09:19:58.964071 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 09:19:59 crc kubenswrapper[4795]: I1205 09:19:59.140093 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wmd24" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="registry-server" probeResult="failure" output=< Dec 05 09:19:59 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:19:59 crc 
kubenswrapper[4795]: > Dec 05 09:20:08 crc kubenswrapper[4795]: I1205 09:20:08.158281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:20:08 crc kubenswrapper[4795]: I1205 09:20:08.289533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:20:08 crc kubenswrapper[4795]: I1205 09:20:08.940687 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmd24"] Dec 05 09:20:09 crc kubenswrapper[4795]: I1205 09:20:09.914011 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmd24" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="registry-server" containerID="cri-o://65fa5fc00d38940f9ba660e8d9ed7175bbea0c525cb383eb51a93aad9de126c9" gracePeriod=2 Dec 05 09:20:10 crc kubenswrapper[4795]: I1205 09:20:10.925297 4795 generic.go:334] "Generic (PLEG): container finished" podID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerID="65fa5fc00d38940f9ba660e8d9ed7175bbea0c525cb383eb51a93aad9de126c9" exitCode=0 Dec 05 09:20:10 crc kubenswrapper[4795]: I1205 09:20:10.925659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmd24" event={"ID":"38315ddb-365c-40aa-99ad-29d3c46cbbd5","Type":"ContainerDied","Data":"65fa5fc00d38940f9ba660e8d9ed7175bbea0c525cb383eb51a93aad9de126c9"} Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.724095 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.749940 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:20:11 crc kubenswrapper[4795]: E1205 09:20:11.750291 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.851580 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-utilities\") pod \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.851700 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-catalog-content\") pod \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.851861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvwpb\" (UniqueName: \"kubernetes.io/projected/38315ddb-365c-40aa-99ad-29d3c46cbbd5-kube-api-access-wvwpb\") pod \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\" (UID: \"38315ddb-365c-40aa-99ad-29d3c46cbbd5\") " Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.852787 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-utilities" (OuterVolumeSpecName: "utilities") pod "38315ddb-365c-40aa-99ad-29d3c46cbbd5" (UID: "38315ddb-365c-40aa-99ad-29d3c46cbbd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.864057 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38315ddb-365c-40aa-99ad-29d3c46cbbd5-kube-api-access-wvwpb" (OuterVolumeSpecName: "kube-api-access-wvwpb") pod "38315ddb-365c-40aa-99ad-29d3c46cbbd5" (UID: "38315ddb-365c-40aa-99ad-29d3c46cbbd5"). InnerVolumeSpecName "kube-api-access-wvwpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.922108 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38315ddb-365c-40aa-99ad-29d3c46cbbd5" (UID: "38315ddb-365c-40aa-99ad-29d3c46cbbd5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.938906 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmd24" event={"ID":"38315ddb-365c-40aa-99ad-29d3c46cbbd5","Type":"ContainerDied","Data":"0f8d9fe290f4a260a70132e4b4f9b81ef16e3288657b2b3a6624b3be580c5b30"} Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.939915 4795 scope.go:117] "RemoveContainer" containerID="65fa5fc00d38940f9ba660e8d9ed7175bbea0c525cb383eb51a93aad9de126c9" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.939258 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmd24" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.955273 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.955605 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38315ddb-365c-40aa-99ad-29d3c46cbbd5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.955759 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvwpb\" (UniqueName: \"kubernetes.io/projected/38315ddb-365c-40aa-99ad-29d3c46cbbd5-kube-api-access-wvwpb\") on node \"crc\" DevicePath \"\"" Dec 05 09:20:11 crc kubenswrapper[4795]: I1205 09:20:11.974905 4795 scope.go:117] "RemoveContainer" containerID="ab8a0f261abc61cb26b48e1581377308c0528fc36d6e46f0c325f1be52a81cc0" Dec 05 09:20:12 crc kubenswrapper[4795]: I1205 09:20:12.028523 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmd24"] Dec 05 09:20:12 crc kubenswrapper[4795]: I1205 09:20:12.029013 4795 scope.go:117] "RemoveContainer" containerID="2301fec7fbd59c8ac1fb90b894140d0a38480106a557b11101a97171c18ad96e" Dec 05 09:20:12 crc kubenswrapper[4795]: I1205 09:20:12.071426 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmd24"] Dec 05 09:20:12 crc kubenswrapper[4795]: I1205 09:20:12.760291 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" path="/var/lib/kubelet/pods/38315ddb-365c-40aa-99ad-29d3c46cbbd5/volumes" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.386482 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-792qz"] Dec 05 09:20:21 crc kubenswrapper[4795]: E1205 09:20:21.387895 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="extract-content" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.387918 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="extract-content" Dec 05 09:20:21 crc kubenswrapper[4795]: E1205 09:20:21.387932 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="extract-utilities" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.387941 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="extract-utilities" Dec 05 09:20:21 crc kubenswrapper[4795]: E1205 09:20:21.387957 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="registry-server" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.387965 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="registry-server" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.388246 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="38315ddb-365c-40aa-99ad-29d3c46cbbd5" containerName="registry-server" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.390107 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.399461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-792qz"] Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.531813 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-catalog-content\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.532043 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-utilities\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.532196 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znn4c\" (UniqueName: \"kubernetes.io/projected/54f3bcbc-c13d-49ab-8696-8e416cabcf74-kube-api-access-znn4c\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.633949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-utilities\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.634529 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-znn4c\" (UniqueName: \"kubernetes.io/projected/54f3bcbc-c13d-49ab-8696-8e416cabcf74-kube-api-access-znn4c\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.634719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-utilities\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.634931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-catalog-content\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.635312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-catalog-content\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.663013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znn4c\" (UniqueName: \"kubernetes.io/projected/54f3bcbc-c13d-49ab-8696-8e416cabcf74-kube-api-access-znn4c\") pod \"redhat-marketplace-792qz\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:21 crc kubenswrapper[4795]: I1205 09:20:21.734505 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:22 crc kubenswrapper[4795]: I1205 09:20:22.465073 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-792qz"] Dec 05 09:20:22 crc kubenswrapper[4795]: I1205 09:20:22.748949 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:20:22 crc kubenswrapper[4795]: E1205 09:20:22.749792 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:20:23 crc kubenswrapper[4795]: I1205 09:20:23.072266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-792qz" event={"ID":"54f3bcbc-c13d-49ab-8696-8e416cabcf74","Type":"ContainerStarted","Data":"cbd45d9f13a1a3802bac366bed6fbbcf5bd40bdce0bbec04f16b886d2e8f5113"} Dec 05 09:20:23 crc kubenswrapper[4795]: I1205 09:20:23.072342 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-792qz" event={"ID":"54f3bcbc-c13d-49ab-8696-8e416cabcf74","Type":"ContainerStarted","Data":"e63432b307c2e749e7810b5ce84588dc9ea1263586d309a45f604b5c52e637a2"} Dec 05 09:20:24 crc kubenswrapper[4795]: I1205 09:20:24.087351 4795 generic.go:334] "Generic (PLEG): container finished" podID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerID="cbd45d9f13a1a3802bac366bed6fbbcf5bd40bdce0bbec04f16b886d2e8f5113" exitCode=0 Dec 05 09:20:24 crc kubenswrapper[4795]: I1205 09:20:24.087618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-792qz" 
event={"ID":"54f3bcbc-c13d-49ab-8696-8e416cabcf74","Type":"ContainerDied","Data":"cbd45d9f13a1a3802bac366bed6fbbcf5bd40bdce0bbec04f16b886d2e8f5113"} Dec 05 09:20:25 crc kubenswrapper[4795]: E1205 09:20:25.741855 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54f3bcbc_c13d_49ab_8696_8e416cabcf74.slice/crio-conmon-55bc5f30c4c40e4f58a4498919c8a488b367b006619204165bfbae36b3f7064b.scope\": RecentStats: unable to find data in memory cache]" Dec 05 09:20:26 crc kubenswrapper[4795]: I1205 09:20:26.112947 4795 generic.go:334] "Generic (PLEG): container finished" podID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerID="55bc5f30c4c40e4f58a4498919c8a488b367b006619204165bfbae36b3f7064b" exitCode=0 Dec 05 09:20:26 crc kubenswrapper[4795]: I1205 09:20:26.113003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-792qz" event={"ID":"54f3bcbc-c13d-49ab-8696-8e416cabcf74","Type":"ContainerDied","Data":"55bc5f30c4c40e4f58a4498919c8a488b367b006619204165bfbae36b3f7064b"} Dec 05 09:20:28 crc kubenswrapper[4795]: I1205 09:20:28.136479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-792qz" event={"ID":"54f3bcbc-c13d-49ab-8696-8e416cabcf74","Type":"ContainerStarted","Data":"7cd65025ec15d1982ee3c525bdb9a8836a4d4c5d5d7a7e69de49150a82c07d6e"} Dec 05 09:20:29 crc kubenswrapper[4795]: I1205 09:20:29.176573 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-792qz" podStartSLOduration=5.569073843 podStartE2EDuration="8.176547127s" podCreationTimestamp="2025-12-05 09:20:21 +0000 UTC" firstStartedPulling="2025-12-05 09:20:24.090360937 +0000 UTC m=+3375.662964666" lastFinishedPulling="2025-12-05 09:20:26.697834211 +0000 UTC m=+3378.270437950" observedRunningTime="2025-12-05 09:20:29.165969301 +0000 UTC 
m=+3380.738573040" watchObservedRunningTime="2025-12-05 09:20:29.176547127 +0000 UTC m=+3380.749150866" Dec 05 09:20:30 crc kubenswrapper[4795]: I1205 09:20:30.492116 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-9m78n" podUID="e1652139-9bce-404b-a089-375e6023dc34" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:20:31 crc kubenswrapper[4795]: I1205 09:20:31.735313 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:31 crc kubenswrapper[4795]: I1205 09:20:31.736305 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:31 crc kubenswrapper[4795]: I1205 09:20:31.792189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:32 crc kubenswrapper[4795]: I1205 09:20:32.231398 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:32 crc kubenswrapper[4795]: I1205 09:20:32.324853 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-792qz"] Dec 05 09:20:33 crc kubenswrapper[4795]: I1205 09:20:33.693863 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:20:33 crc kubenswrapper[4795]: I1205 09:20:33.693863 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:20:33 crc kubenswrapper[4795]: I1205 
09:20:33.751450 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:20:33 crc kubenswrapper[4795]: E1205 09:20:33.752015 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:20:34 crc kubenswrapper[4795]: I1205 09:20:34.198975 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-792qz" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerName="registry-server" containerID="cri-o://7cd65025ec15d1982ee3c525bdb9a8836a4d4c5d5d7a7e69de49150a82c07d6e" gracePeriod=2 Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.217103 4795 generic.go:334] "Generic (PLEG): container finished" podID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerID="7cd65025ec15d1982ee3c525bdb9a8836a4d4c5d5d7a7e69de49150a82c07d6e" exitCode=0 Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.217470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-792qz" event={"ID":"54f3bcbc-c13d-49ab-8696-8e416cabcf74","Type":"ContainerDied","Data":"7cd65025ec15d1982ee3c525bdb9a8836a4d4c5d5d7a7e69de49150a82c07d6e"} Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.217564 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-792qz" event={"ID":"54f3bcbc-c13d-49ab-8696-8e416cabcf74","Type":"ContainerDied","Data":"e63432b307c2e749e7810b5ce84588dc9ea1263586d309a45f604b5c52e637a2"} Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.217583 4795 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="e63432b307c2e749e7810b5ce84588dc9ea1263586d309a45f604b5c52e637a2" Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.313113 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.505850 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-utilities\") pod \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.505963 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znn4c\" (UniqueName: \"kubernetes.io/projected/54f3bcbc-c13d-49ab-8696-8e416cabcf74-kube-api-access-znn4c\") pod \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.506269 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-catalog-content\") pod \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\" (UID: \"54f3bcbc-c13d-49ab-8696-8e416cabcf74\") " Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.507468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-utilities" (OuterVolumeSpecName: "utilities") pod "54f3bcbc-c13d-49ab-8696-8e416cabcf74" (UID: "54f3bcbc-c13d-49ab-8696-8e416cabcf74"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.523584 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f3bcbc-c13d-49ab-8696-8e416cabcf74-kube-api-access-znn4c" (OuterVolumeSpecName: "kube-api-access-znn4c") pod "54f3bcbc-c13d-49ab-8696-8e416cabcf74" (UID: "54f3bcbc-c13d-49ab-8696-8e416cabcf74"). InnerVolumeSpecName "kube-api-access-znn4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.533756 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54f3bcbc-c13d-49ab-8696-8e416cabcf74" (UID: "54f3bcbc-c13d-49ab-8696-8e416cabcf74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.609536 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.609581 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54f3bcbc-c13d-49ab-8696-8e416cabcf74-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:20:35 crc kubenswrapper[4795]: I1205 09:20:35.609593 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znn4c\" (UniqueName: \"kubernetes.io/projected/54f3bcbc-c13d-49ab-8696-8e416cabcf74-kube-api-access-znn4c\") on node \"crc\" DevicePath \"\"" Dec 05 09:20:36 crc kubenswrapper[4795]: I1205 09:20:36.243935 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-792qz" Dec 05 09:20:36 crc kubenswrapper[4795]: I1205 09:20:36.307189 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-792qz"] Dec 05 09:20:36 crc kubenswrapper[4795]: I1205 09:20:36.320253 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-792qz"] Dec 05 09:20:36 crc kubenswrapper[4795]: I1205 09:20:36.762891 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" path="/var/lib/kubelet/pods/54f3bcbc-c13d-49ab-8696-8e416cabcf74/volumes" Dec 05 09:20:39 crc kubenswrapper[4795]: I1205 09:20:39.945852 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" podUID="2297e0a2-10ff-47d9-8acf-c94bf4bddc9f" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.48:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:20:39 crc kubenswrapper[4795]: I1205 09:20:39.945938 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-q56zl" podUID="2297e0a2-10ff-47d9-8acf-c94bf4bddc9f" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.48:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:20:39 crc kubenswrapper[4795]: I1205 09:20:39.989187 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-q5zwr" podUID="74cd8f10-9003-46be-992c-2b23202839bb" containerName="registry-server" probeResult="failure" output=< Dec 05 09:20:39 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:20:39 crc kubenswrapper[4795]: > Dec 05 09:20:46 crc kubenswrapper[4795]: I1205 09:20:46.748240 4795 scope.go:117] 
"RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:20:46 crc kubenswrapper[4795]: E1205 09:20:46.749286 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:20:58 crc kubenswrapper[4795]: I1205 09:20:58.755686 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:20:58 crc kubenswrapper[4795]: E1205 09:20:58.756925 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:21:08 crc kubenswrapper[4795]: I1205 09:21:08.551917 4795 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v8qz8 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 09:21:08 crc kubenswrapper[4795]: I1205 09:21:08.552862 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" podUID="62d33db5-212f-4884-b78b-159f06592142" containerName="authentication-operator" probeResult="failure" output="Get 
\"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 09:21:12 crc kubenswrapper[4795]: I1205 09:21:12.747667 4795 scope.go:117] "RemoveContainer" containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:21:13 crc kubenswrapper[4795]: I1205 09:21:13.644264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"d5ad1030c9b48aa0364f4169ee5c00a5a766cdeaa9291c246f42015eff577f19"} Dec 05 09:22:17 crc kubenswrapper[4795]: I1205 09:22:17.681973 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-jcbss" podUID="4d920ea1-76ae-4bb3-831f-e83ac4d57fbe" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.67:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.359129 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xtlbx"] Dec 05 09:22:18 crc kubenswrapper[4795]: E1205 09:22:18.359755 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerName="extract-content" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.359781 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerName="extract-content" Dec 05 09:22:18 crc kubenswrapper[4795]: E1205 09:22:18.359835 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerName="registry-server" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.359844 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" 
containerName="registry-server" Dec 05 09:22:18 crc kubenswrapper[4795]: E1205 09:22:18.359864 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerName="extract-utilities" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.359874 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerName="extract-utilities" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.360153 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="54f3bcbc-c13d-49ab-8696-8e416cabcf74" containerName="registry-server" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.362865 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.372818 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xtlbx"] Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.496483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wl9d\" (UniqueName: \"kubernetes.io/projected/72b07569-2c12-4f33-bbda-521b0e061be7-kube-api-access-9wl9d\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.496747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-utilities\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.496832 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-catalog-content\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.599339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wl9d\" (UniqueName: \"kubernetes.io/projected/72b07569-2c12-4f33-bbda-521b0e061be7-kube-api-access-9wl9d\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.600473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-utilities\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.600600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-catalog-content\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.601097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-utilities\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:18 crc kubenswrapper[4795]: I1205 09:22:18.601353 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-catalog-content\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:19 crc kubenswrapper[4795]: I1205 09:22:19.107262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wl9d\" (UniqueName: \"kubernetes.io/projected/72b07569-2c12-4f33-bbda-521b0e061be7-kube-api-access-9wl9d\") pod \"redhat-operators-xtlbx\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:19 crc kubenswrapper[4795]: I1205 09:22:19.293576 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:20 crc kubenswrapper[4795]: I1205 09:22:20.094784 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xtlbx"] Dec 05 09:22:20 crc kubenswrapper[4795]: I1205 09:22:20.452394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtlbx" event={"ID":"72b07569-2c12-4f33-bbda-521b0e061be7","Type":"ContainerStarted","Data":"46aadd1f6397b211ea575e53f9d0c5503a11534296c7f6d876bb7e08a12724fd"} Dec 05 09:22:20 crc kubenswrapper[4795]: I1205 09:22:20.452940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtlbx" event={"ID":"72b07569-2c12-4f33-bbda-521b0e061be7","Type":"ContainerStarted","Data":"20a87e15a9d7d7a7fd6ef4aa9f03cafaa2f7fdfcd6f13ca131fa01601726966d"} Dec 05 09:22:21 crc kubenswrapper[4795]: I1205 09:22:21.471710 4795 generic.go:334] "Generic (PLEG): container finished" podID="72b07569-2c12-4f33-bbda-521b0e061be7" containerID="46aadd1f6397b211ea575e53f9d0c5503a11534296c7f6d876bb7e08a12724fd" exitCode=0 Dec 05 09:22:21 crc kubenswrapper[4795]: I1205 09:22:21.472783 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xtlbx" event={"ID":"72b07569-2c12-4f33-bbda-521b0e061be7","Type":"ContainerDied","Data":"46aadd1f6397b211ea575e53f9d0c5503a11534296c7f6d876bb7e08a12724fd"} Dec 05 09:22:21 crc kubenswrapper[4795]: I1205 09:22:21.478684 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:22:26 crc kubenswrapper[4795]: I1205 09:22:26.528714 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtlbx" event={"ID":"72b07569-2c12-4f33-bbda-521b0e061be7","Type":"ContainerStarted","Data":"b4a3f5dbfdd8375184021a5ce73aaa9c624496c2673f8558078be6eb749fb518"} Dec 05 09:22:31 crc kubenswrapper[4795]: I1205 09:22:31.473860 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-jsnlr" podUID="3b63dece-6484-4464-b6a2-c8dcdbb34eae" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:22:34 crc kubenswrapper[4795]: I1205 09:22:34.213899 4795 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-6fkrg container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 09:22:34 crc kubenswrapper[4795]: I1205 09:22:34.214384 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6fkrg" podUID="ea39bb65-aae0-48fe-ae6a-4736ab5cf336" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 09:22:44 crc kubenswrapper[4795]: I1205 09:22:44.753397 4795 generic.go:334] "Generic (PLEG): container 
finished" podID="72b07569-2c12-4f33-bbda-521b0e061be7" containerID="b4a3f5dbfdd8375184021a5ce73aaa9c624496c2673f8558078be6eb749fb518" exitCode=0 Dec 05 09:22:44 crc kubenswrapper[4795]: I1205 09:22:44.765882 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtlbx" event={"ID":"72b07569-2c12-4f33-bbda-521b0e061be7","Type":"ContainerDied","Data":"b4a3f5dbfdd8375184021a5ce73aaa9c624496c2673f8558078be6eb749fb518"} Dec 05 09:22:48 crc kubenswrapper[4795]: I1205 09:22:48.800341 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtlbx" event={"ID":"72b07569-2c12-4f33-bbda-521b0e061be7","Type":"ContainerStarted","Data":"5f3afbc1abc614a1225d6418b4b640609b5495b3a1e079b5df22594a71973795"} Dec 05 09:22:48 crc kubenswrapper[4795]: I1205 09:22:48.836025 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xtlbx" podStartSLOduration=4.164431444 podStartE2EDuration="30.835992555s" podCreationTimestamp="2025-12-05 09:22:18 +0000 UTC" firstStartedPulling="2025-12-05 09:22:21.476819935 +0000 UTC m=+3493.049423674" lastFinishedPulling="2025-12-05 09:22:48.148381046 +0000 UTC m=+3519.720984785" observedRunningTime="2025-12-05 09:22:48.825898592 +0000 UTC m=+3520.398502331" watchObservedRunningTime="2025-12-05 09:22:48.835992555 +0000 UTC m=+3520.408596294" Dec 05 09:22:49 crc kubenswrapper[4795]: I1205 09:22:49.294466 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:49 crc kubenswrapper[4795]: I1205 09:22:49.295016 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:50 crc kubenswrapper[4795]: I1205 09:22:50.349771 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xtlbx" 
podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="registry-server" probeResult="failure" output=< Dec 05 09:22:50 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:22:50 crc kubenswrapper[4795]: > Dec 05 09:22:53 crc kubenswrapper[4795]: I1205 09:22:53.002776 4795 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.308002382s: [/var/lib/containers/storage/overlay/7484e0d8f3b7e69a13e6d1791295a2f4128e67f771155cc8a791712ba9decffd/diff /var/log/pods/openstack_placement-56588789f4-7xbdx_40ecc6c1-814a-40dc-988b-d4b67a58794b/placement-api/0.log]; will not log again for this container unless duration exceeds 2s Dec 05 09:22:59 crc kubenswrapper[4795]: I1205 09:22:59.354855 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:59 crc kubenswrapper[4795]: I1205 09:22:59.450655 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:22:59 crc kubenswrapper[4795]: I1205 09:22:59.606832 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xtlbx"] Dec 05 09:23:00 crc kubenswrapper[4795]: I1205 09:23:00.927320 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xtlbx" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="registry-server" containerID="cri-o://5f3afbc1abc614a1225d6418b4b640609b5495b3a1e079b5df22594a71973795" gracePeriod=2 Dec 05 09:23:01 crc kubenswrapper[4795]: I1205 09:23:01.939720 4795 generic.go:334] "Generic (PLEG): container finished" podID="72b07569-2c12-4f33-bbda-521b0e061be7" containerID="5f3afbc1abc614a1225d6418b4b640609b5495b3a1e079b5df22594a71973795" exitCode=0 Dec 05 09:23:01 crc kubenswrapper[4795]: I1205 09:23:01.939814 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xtlbx" event={"ID":"72b07569-2c12-4f33-bbda-521b0e061be7","Type":"ContainerDied","Data":"5f3afbc1abc614a1225d6418b4b640609b5495b3a1e079b5df22594a71973795"} Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.626575 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.784564 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-catalog-content\") pod \"72b07569-2c12-4f33-bbda-521b0e061be7\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.785256 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wl9d\" (UniqueName: \"kubernetes.io/projected/72b07569-2c12-4f33-bbda-521b0e061be7-kube-api-access-9wl9d\") pod \"72b07569-2c12-4f33-bbda-521b0e061be7\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.785331 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-utilities\") pod \"72b07569-2c12-4f33-bbda-521b0e061be7\" (UID: \"72b07569-2c12-4f33-bbda-521b0e061be7\") " Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.786876 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-utilities" (OuterVolumeSpecName: "utilities") pod "72b07569-2c12-4f33-bbda-521b0e061be7" (UID: "72b07569-2c12-4f33-bbda-521b0e061be7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.850110 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b07569-2c12-4f33-bbda-521b0e061be7-kube-api-access-9wl9d" (OuterVolumeSpecName: "kube-api-access-9wl9d") pod "72b07569-2c12-4f33-bbda-521b0e061be7" (UID: "72b07569-2c12-4f33-bbda-521b0e061be7"). InnerVolumeSpecName "kube-api-access-9wl9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.888516 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wl9d\" (UniqueName: \"kubernetes.io/projected/72b07569-2c12-4f33-bbda-521b0e061be7-kube-api-access-9wl9d\") on node \"crc\" DevicePath \"\"" Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.888569 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.929414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72b07569-2c12-4f33-bbda-521b0e061be7" (UID: "72b07569-2c12-4f33-bbda-521b0e061be7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.977845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xtlbx" event={"ID":"72b07569-2c12-4f33-bbda-521b0e061be7","Type":"ContainerDied","Data":"20a87e15a9d7d7a7fd6ef4aa9f03cafaa2f7fdfcd6f13ca131fa01601726966d"} Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.977931 4795 scope.go:117] "RemoveContainer" containerID="5f3afbc1abc614a1225d6418b4b640609b5495b3a1e079b5df22594a71973795" Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.978175 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xtlbx" Dec 05 09:23:04 crc kubenswrapper[4795]: I1205 09:23:04.991473 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72b07569-2c12-4f33-bbda-521b0e061be7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:23:05 crc kubenswrapper[4795]: I1205 09:23:05.029965 4795 scope.go:117] "RemoveContainer" containerID="b4a3f5dbfdd8375184021a5ce73aaa9c624496c2673f8558078be6eb749fb518" Dec 05 09:23:05 crc kubenswrapper[4795]: I1205 09:23:05.037276 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xtlbx"] Dec 05 09:23:05 crc kubenswrapper[4795]: I1205 09:23:05.059279 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xtlbx"] Dec 05 09:23:05 crc kubenswrapper[4795]: I1205 09:23:05.088041 4795 scope.go:117] "RemoveContainer" containerID="46aadd1f6397b211ea575e53f9d0c5503a11534296c7f6d876bb7e08a12724fd" Dec 05 09:23:06 crc kubenswrapper[4795]: I1205 09:23:06.761251 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" path="/var/lib/kubelet/pods/72b07569-2c12-4f33-bbda-521b0e061be7/volumes" Dec 05 09:23:40 crc 
kubenswrapper[4795]: I1205 09:23:40.826942 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:23:40 crc kubenswrapper[4795]: I1205 09:23:40.827566 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:24:10 crc kubenswrapper[4795]: I1205 09:24:10.828020 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:24:10 crc kubenswrapper[4795]: I1205 09:24:10.829183 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:24:40 crc kubenswrapper[4795]: I1205 09:24:40.827295 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:24:40 crc kubenswrapper[4795]: I1205 09:24:40.829499 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:24:40 crc kubenswrapper[4795]: I1205 09:24:40.829667 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:24:40 crc kubenswrapper[4795]: I1205 09:24:40.830693 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5ad1030c9b48aa0364f4169ee5c00a5a766cdeaa9291c246f42015eff577f19"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:24:40 crc kubenswrapper[4795]: I1205 09:24:40.830986 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://d5ad1030c9b48aa0364f4169ee5c00a5a766cdeaa9291c246f42015eff577f19" gracePeriod=600 Dec 05 09:24:41 crc kubenswrapper[4795]: I1205 09:24:41.261569 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="d5ad1030c9b48aa0364f4169ee5c00a5a766cdeaa9291c246f42015eff577f19" exitCode=0 Dec 05 09:24:41 crc kubenswrapper[4795]: I1205 09:24:41.261697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"d5ad1030c9b48aa0364f4169ee5c00a5a766cdeaa9291c246f42015eff577f19"} Dec 05 09:24:41 crc kubenswrapper[4795]: I1205 09:24:41.262098 4795 scope.go:117] "RemoveContainer" 
containerID="19cc122ddf2dbb8ec96f687a268ab31822c64af936f0c61453d75ba61f80ad64" Dec 05 09:24:43 crc kubenswrapper[4795]: I1205 09:24:43.290686 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7"} Dec 05 09:25:40 crc kubenswrapper[4795]: I1205 09:25:40.938401 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5frtg"] Dec 05 09:25:40 crc kubenswrapper[4795]: E1205 09:25:40.939731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="extract-utilities" Dec 05 09:25:40 crc kubenswrapper[4795]: I1205 09:25:40.939753 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="extract-utilities" Dec 05 09:25:40 crc kubenswrapper[4795]: E1205 09:25:40.939781 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="registry-server" Dec 05 09:25:40 crc kubenswrapper[4795]: I1205 09:25:40.939789 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="registry-server" Dec 05 09:25:40 crc kubenswrapper[4795]: E1205 09:25:40.939815 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="extract-content" Dec 05 09:25:40 crc kubenswrapper[4795]: I1205 09:25:40.939822 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="extract-content" Dec 05 09:25:40 crc kubenswrapper[4795]: I1205 09:25:40.940009 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b07569-2c12-4f33-bbda-521b0e061be7" containerName="registry-server" Dec 05 
09:25:40 crc kubenswrapper[4795]: I1205 09:25:40.944136 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:40 crc kubenswrapper[4795]: I1205 09:25:40.952525 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5frtg"] Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.012108 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-catalog-content\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.012622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl4w8\" (UniqueName: \"kubernetes.io/projected/d6a54d26-76e4-482d-bd4c-237f605237af-kube-api-access-kl4w8\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.012695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-utilities\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.115379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-catalog-content\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " 
pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.115511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl4w8\" (UniqueName: \"kubernetes.io/projected/d6a54d26-76e4-482d-bd4c-237f605237af-kube-api-access-kl4w8\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.115632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-utilities\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.116471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-utilities\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.116577 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-catalog-content\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.144452 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl4w8\" (UniqueName: \"kubernetes.io/projected/d6a54d26-76e4-482d-bd4c-237f605237af-kube-api-access-kl4w8\") pod \"certified-operators-5frtg\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " 
pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.271858 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.901896 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5frtg"] Dec 05 09:25:41 crc kubenswrapper[4795]: I1205 09:25:41.996922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frtg" event={"ID":"d6a54d26-76e4-482d-bd4c-237f605237af","Type":"ContainerStarted","Data":"2b7fb132f26a0a1547c84849cb84d5eb297d64a7d6b0c5677e37672260badea5"} Dec 05 09:25:43 crc kubenswrapper[4795]: I1205 09:25:43.011432 4795 generic.go:334] "Generic (PLEG): container finished" podID="d6a54d26-76e4-482d-bd4c-237f605237af" containerID="07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4" exitCode=0 Dec 05 09:25:43 crc kubenswrapper[4795]: I1205 09:25:43.011526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frtg" event={"ID":"d6a54d26-76e4-482d-bd4c-237f605237af","Type":"ContainerDied","Data":"07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4"} Dec 05 09:25:44 crc kubenswrapper[4795]: I1205 09:25:44.029124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frtg" event={"ID":"d6a54d26-76e4-482d-bd4c-237f605237af","Type":"ContainerStarted","Data":"3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655"} Dec 05 09:25:46 crc kubenswrapper[4795]: I1205 09:25:46.053018 4795 generic.go:334] "Generic (PLEG): container finished" podID="d6a54d26-76e4-482d-bd4c-237f605237af" containerID="3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655" exitCode=0 Dec 05 09:25:46 crc kubenswrapper[4795]: I1205 09:25:46.053129 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frtg" event={"ID":"d6a54d26-76e4-482d-bd4c-237f605237af","Type":"ContainerDied","Data":"3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655"} Dec 05 09:25:50 crc kubenswrapper[4795]: I1205 09:25:50.172242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frtg" event={"ID":"d6a54d26-76e4-482d-bd4c-237f605237af","Type":"ContainerStarted","Data":"f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1"} Dec 05 09:25:50 crc kubenswrapper[4795]: I1205 09:25:50.230531 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5frtg" podStartSLOduration=3.589113472 podStartE2EDuration="10.230504547s" podCreationTimestamp="2025-12-05 09:25:40 +0000 UTC" firstStartedPulling="2025-12-05 09:25:43.014543815 +0000 UTC m=+3694.587147564" lastFinishedPulling="2025-12-05 09:25:49.6559349 +0000 UTC m=+3701.228538639" observedRunningTime="2025-12-05 09:25:50.214501025 +0000 UTC m=+3701.787104764" watchObservedRunningTime="2025-12-05 09:25:50.230504547 +0000 UTC m=+3701.803108286" Dec 05 09:25:51 crc kubenswrapper[4795]: I1205 09:25:51.272018 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:51 crc kubenswrapper[4795]: I1205 09:25:51.272858 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:25:52 crc kubenswrapper[4795]: I1205 09:25:52.357027 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-5frtg" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="registry-server" probeResult="failure" output=< Dec 05 09:25:52 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:25:52 crc kubenswrapper[4795]: > Dec 05 09:26:01 
crc kubenswrapper[4795]: I1205 09:26:01.329189 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:26:01 crc kubenswrapper[4795]: I1205 09:26:01.391063 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:26:01 crc kubenswrapper[4795]: I1205 09:26:01.574724 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5frtg"] Dec 05 09:26:03 crc kubenswrapper[4795]: I1205 09:26:03.324219 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5frtg" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="registry-server" containerID="cri-o://f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1" gracePeriod=2 Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.126811 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.211371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-utilities\") pod \"d6a54d26-76e4-482d-bd4c-237f605237af\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.211576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-catalog-content\") pod \"d6a54d26-76e4-482d-bd4c-237f605237af\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.211709 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl4w8\" (UniqueName: \"kubernetes.io/projected/d6a54d26-76e4-482d-bd4c-237f605237af-kube-api-access-kl4w8\") pod \"d6a54d26-76e4-482d-bd4c-237f605237af\" (UID: \"d6a54d26-76e4-482d-bd4c-237f605237af\") " Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.215931 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-utilities" (OuterVolumeSpecName: "utilities") pod "d6a54d26-76e4-482d-bd4c-237f605237af" (UID: "d6a54d26-76e4-482d-bd4c-237f605237af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.236907 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a54d26-76e4-482d-bd4c-237f605237af-kube-api-access-kl4w8" (OuterVolumeSpecName: "kube-api-access-kl4w8") pod "d6a54d26-76e4-482d-bd4c-237f605237af" (UID: "d6a54d26-76e4-482d-bd4c-237f605237af"). InnerVolumeSpecName "kube-api-access-kl4w8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.314961 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl4w8\" (UniqueName: \"kubernetes.io/projected/d6a54d26-76e4-482d-bd4c-237f605237af-kube-api-access-kl4w8\") on node \"crc\" DevicePath \"\"" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.315004 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.317049 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6a54d26-76e4-482d-bd4c-237f605237af" (UID: "d6a54d26-76e4-482d-bd4c-237f605237af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.346638 4795 generic.go:334] "Generic (PLEG): container finished" podID="d6a54d26-76e4-482d-bd4c-237f605237af" containerID="f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1" exitCode=0 Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.346705 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5frtg" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.346706 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frtg" event={"ID":"d6a54d26-76e4-482d-bd4c-237f605237af","Type":"ContainerDied","Data":"f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1"} Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.346858 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5frtg" event={"ID":"d6a54d26-76e4-482d-bd4c-237f605237af","Type":"ContainerDied","Data":"2b7fb132f26a0a1547c84849cb84d5eb297d64a7d6b0c5677e37672260badea5"} Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.346882 4795 scope.go:117] "RemoveContainer" containerID="f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.397299 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5frtg"] Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.403652 4795 scope.go:117] "RemoveContainer" containerID="3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.409486 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5frtg"] Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.419123 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a54d26-76e4-482d-bd4c-237f605237af-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.455069 4795 scope.go:117] "RemoveContainer" containerID="07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.501529 4795 scope.go:117] "RemoveContainer" 
containerID="f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1" Dec 05 09:26:04 crc kubenswrapper[4795]: E1205 09:26:04.502403 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1\": container with ID starting with f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1 not found: ID does not exist" containerID="f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.502446 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1"} err="failed to get container status \"f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1\": rpc error: code = NotFound desc = could not find container \"f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1\": container with ID starting with f4263208cae85f0cb8a5ddc83672160749a1f78f7eb00adb812f83bcbf9afdb1 not found: ID does not exist" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.502481 4795 scope.go:117] "RemoveContainer" containerID="3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655" Dec 05 09:26:04 crc kubenswrapper[4795]: E1205 09:26:04.503233 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655\": container with ID starting with 3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655 not found: ID does not exist" containerID="3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.503283 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655"} err="failed to get container status \"3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655\": rpc error: code = NotFound desc = could not find container \"3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655\": container with ID starting with 3c034463ca4c1bace5bef2b6ca486f68ba318de67befa1d89d4ffba48265f655 not found: ID does not exist" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.503300 4795 scope.go:117] "RemoveContainer" containerID="07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4" Dec 05 09:26:04 crc kubenswrapper[4795]: E1205 09:26:04.503564 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4\": container with ID starting with 07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4 not found: ID does not exist" containerID="07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.503592 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4"} err="failed to get container status \"07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4\": rpc error: code = NotFound desc = could not find container \"07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4\": container with ID starting with 07dd923a442b66f3cc45ee0fd473869005d5810ad1ba69f8b46956ffba1e7af4 not found: ID does not exist" Dec 05 09:26:04 crc kubenswrapper[4795]: I1205 09:26:04.760394 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" path="/var/lib/kubelet/pods/d6a54d26-76e4-482d-bd4c-237f605237af/volumes" Dec 05 09:26:48 crc kubenswrapper[4795]: I1205 
09:26:48.343975 4795 patch_prober.go:28] interesting pod/console-operator-58897d9998-d7l5q container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 09:26:48 crc kubenswrapper[4795]: I1205 09:26:48.346254 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-d7l5q" podUID="d28c9743-ac3d-478a-8b4d-92510027278f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 09:26:48 crc kubenswrapper[4795]: I1205 09:26:48.964272 4795 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 09:26:48 crc kubenswrapper[4795]: I1205 09:26:48.964787 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:26:53 crc kubenswrapper[4795]: I1205 09:26:53.875629 4795 scope.go:117] "RemoveContainer" containerID="7cd65025ec15d1982ee3c525bdb9a8836a4d4c5d5d7a7e69de49150a82c07d6e" Dec 05 09:26:53 crc kubenswrapper[4795]: I1205 09:26:53.902494 4795 scope.go:117] "RemoveContainer" containerID="55bc5f30c4c40e4f58a4498919c8a488b367b006619204165bfbae36b3f7064b" Dec 05 09:26:53 crc kubenswrapper[4795]: I1205 09:26:53.955471 4795 scope.go:117] "RemoveContainer" 
containerID="cbd45d9f13a1a3802bac366bed6fbbcf5bd40bdce0bbec04f16b886d2e8f5113" Dec 05 09:27:10 crc kubenswrapper[4795]: I1205 09:27:10.827353 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:27:10 crc kubenswrapper[4795]: I1205 09:27:10.828201 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:27:40 crc kubenswrapper[4795]: I1205 09:27:40.827348 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:27:40 crc kubenswrapper[4795]: I1205 09:27:40.828145 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:28:10 crc kubenswrapper[4795]: I1205 09:28:10.827829 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:28:10 crc kubenswrapper[4795]: I1205 09:28:10.828566 4795 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:28:10 crc kubenswrapper[4795]: I1205 09:28:10.828643 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:28:10 crc kubenswrapper[4795]: I1205 09:28:10.829470 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:28:10 crc kubenswrapper[4795]: I1205 09:28:10.829530 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" gracePeriod=600 Dec 05 09:28:11 crc kubenswrapper[4795]: E1205 09:28:11.808197 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:28:11 crc kubenswrapper[4795]: I1205 09:28:11.816781 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" exitCode=0 Dec 05 09:28:11 crc kubenswrapper[4795]: I1205 09:28:11.816857 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7"} Dec 05 09:28:11 crc kubenswrapper[4795]: I1205 09:28:11.816936 4795 scope.go:117] "RemoveContainer" containerID="d5ad1030c9b48aa0364f4169ee5c00a5a766cdeaa9291c246f42015eff577f19" Dec 05 09:28:11 crc kubenswrapper[4795]: I1205 09:28:11.818201 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:28:11 crc kubenswrapper[4795]: E1205 09:28:11.818631 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:28:23 crc kubenswrapper[4795]: I1205 09:28:23.747728 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:28:23 crc kubenswrapper[4795]: E1205 09:28:23.748745 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 
09:28:34 crc kubenswrapper[4795]: I1205 09:28:34.747883 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:28:34 crc kubenswrapper[4795]: E1205 09:28:34.748865 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:28:45 crc kubenswrapper[4795]: I1205 09:28:45.748779 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:28:45 crc kubenswrapper[4795]: E1205 09:28:45.749983 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:28:56 crc kubenswrapper[4795]: I1205 09:28:56.747465 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:28:56 crc kubenswrapper[4795]: E1205 09:28:56.749805 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:29:11 crc kubenswrapper[4795]: I1205 09:29:11.748660 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:29:11 crc kubenswrapper[4795]: E1205 09:29:11.749599 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:29:26 crc kubenswrapper[4795]: I1205 09:29:26.754224 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:29:26 crc kubenswrapper[4795]: E1205 09:29:26.755212 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:29:41 crc kubenswrapper[4795]: I1205 09:29:41.749021 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:29:41 crc kubenswrapper[4795]: E1205 09:29:41.750142 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.490303 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5s2pq"] Dec 05 09:29:45 crc kubenswrapper[4795]: E1205 09:29:45.491466 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="extract-content" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.491486 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="extract-content" Dec 05 09:29:45 crc kubenswrapper[4795]: E1205 09:29:45.491522 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="extract-utilities" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.491529 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="extract-utilities" Dec 05 09:29:45 crc kubenswrapper[4795]: E1205 09:29:45.491558 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="registry-server" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.491564 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="registry-server" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.491759 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a54d26-76e4-482d-bd4c-237f605237af" containerName="registry-server" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.493330 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.522971 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5s2pq"] Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.611144 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lnd9\" (UniqueName: \"kubernetes.io/projected/092ca8b6-19f2-40a8-a4bb-f1924982e4db-kube-api-access-4lnd9\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.611224 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-utilities\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.611359 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-catalog-content\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.714649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lnd9\" (UniqueName: \"kubernetes.io/projected/092ca8b6-19f2-40a8-a4bb-f1924982e4db-kube-api-access-4lnd9\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.714734 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-utilities\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.714757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-catalog-content\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.715390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-utilities\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.715427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-catalog-content\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.745998 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lnd9\" (UniqueName: \"kubernetes.io/projected/092ca8b6-19f2-40a8-a4bb-f1924982e4db-kube-api-access-4lnd9\") pod \"community-operators-5s2pq\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:45 crc kubenswrapper[4795]: I1205 09:29:45.822822 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:46 crc kubenswrapper[4795]: I1205 09:29:46.553883 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5s2pq"] Dec 05 09:29:46 crc kubenswrapper[4795]: I1205 09:29:46.910176 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5s2pq" event={"ID":"092ca8b6-19f2-40a8-a4bb-f1924982e4db","Type":"ContainerStarted","Data":"0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486"} Dec 05 09:29:46 crc kubenswrapper[4795]: I1205 09:29:46.910630 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5s2pq" event={"ID":"092ca8b6-19f2-40a8-a4bb-f1924982e4db","Type":"ContainerStarted","Data":"e27438f18c7384717817ec3953f8532f90a0e11966d1c2ddea59a3d86a535277"} Dec 05 09:29:47 crc kubenswrapper[4795]: I1205 09:29:47.923065 4795 generic.go:334] "Generic (PLEG): container finished" podID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerID="0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486" exitCode=0 Dec 05 09:29:47 crc kubenswrapper[4795]: I1205 09:29:47.923153 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5s2pq" event={"ID":"092ca8b6-19f2-40a8-a4bb-f1924982e4db","Type":"ContainerDied","Data":"0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486"} Dec 05 09:29:47 crc kubenswrapper[4795]: I1205 09:29:47.928120 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:29:48 crc kubenswrapper[4795]: I1205 09:29:48.943890 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5s2pq" event={"ID":"092ca8b6-19f2-40a8-a4bb-f1924982e4db","Type":"ContainerStarted","Data":"9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3"} Dec 05 09:29:51 crc 
kubenswrapper[4795]: I1205 09:29:51.981683 4795 generic.go:334] "Generic (PLEG): container finished" podID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerID="9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3" exitCode=0 Dec 05 09:29:51 crc kubenswrapper[4795]: I1205 09:29:51.981763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5s2pq" event={"ID":"092ca8b6-19f2-40a8-a4bb-f1924982e4db","Type":"ContainerDied","Data":"9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3"} Dec 05 09:29:52 crc kubenswrapper[4795]: I1205 09:29:52.748209 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:29:52 crc kubenswrapper[4795]: E1205 09:29:52.749377 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:29:53 crc kubenswrapper[4795]: I1205 09:29:53.002637 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5s2pq" event={"ID":"092ca8b6-19f2-40a8-a4bb-f1924982e4db","Type":"ContainerStarted","Data":"7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca"} Dec 05 09:29:55 crc kubenswrapper[4795]: I1205 09:29:55.822973 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:55 crc kubenswrapper[4795]: I1205 09:29:55.823829 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:29:56 crc kubenswrapper[4795]: I1205 09:29:56.881930 4795 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5s2pq" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="registry-server" probeResult="failure" output=< Dec 05 09:29:56 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:29:56 crc kubenswrapper[4795]: > Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.204020 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5s2pq" podStartSLOduration=10.646415708 podStartE2EDuration="15.203986556s" podCreationTimestamp="2025-12-05 09:29:45 +0000 UTC" firstStartedPulling="2025-12-05 09:29:47.927849267 +0000 UTC m=+3939.500453006" lastFinishedPulling="2025-12-05 09:29:52.485420115 +0000 UTC m=+3944.058023854" observedRunningTime="2025-12-05 09:29:53.037288444 +0000 UTC m=+3944.609892183" watchObservedRunningTime="2025-12-05 09:30:00.203986556 +0000 UTC m=+3951.776590285" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.207410 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj"] Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.209170 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.213872 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.216716 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.219285 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj"] Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.261437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22400191-14f2-40bd-a5a8-f353664cc4b9-secret-volume\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.261811 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22400191-14f2-40bd-a5a8-f353664cc4b9-config-volume\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.261908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5dgs\" (UniqueName: \"kubernetes.io/projected/22400191-14f2-40bd-a5a8-f353664cc4b9-kube-api-access-r5dgs\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.368752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22400191-14f2-40bd-a5a8-f353664cc4b9-secret-volume\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.369372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22400191-14f2-40bd-a5a8-f353664cc4b9-config-volume\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.369595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5dgs\" (UniqueName: \"kubernetes.io/projected/22400191-14f2-40bd-a5a8-f353664cc4b9-kube-api-access-r5dgs\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.371277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22400191-14f2-40bd-a5a8-f353664cc4b9-config-volume\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.387654 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/22400191-14f2-40bd-a5a8-f353664cc4b9-secret-volume\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.436648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5dgs\" (UniqueName: \"kubernetes.io/projected/22400191-14f2-40bd-a5a8-f353664cc4b9-kube-api-access-r5dgs\") pod \"collect-profiles-29415450-5q8dj\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:00 crc kubenswrapper[4795]: I1205 09:30:00.540706 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:01 crc kubenswrapper[4795]: I1205 09:30:01.366999 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj"] Dec 05 09:30:01 crc kubenswrapper[4795]: W1205 09:30:01.376571 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22400191_14f2_40bd_a5a8_f353664cc4b9.slice/crio-791623b9b7073c05c64513fd91e631fda910ab6138fa776b59c3b300fa694a65 WatchSource:0}: Error finding container 791623b9b7073c05c64513fd91e631fda910ab6138fa776b59c3b300fa694a65: Status 404 returned error can't find the container with id 791623b9b7073c05c64513fd91e631fda910ab6138fa776b59c3b300fa694a65 Dec 05 09:30:02 crc kubenswrapper[4795]: I1205 09:30:02.103100 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" event={"ID":"22400191-14f2-40bd-a5a8-f353664cc4b9","Type":"ContainerStarted","Data":"1348ab21a38dd3c883d80dc1d2d3fbcdb3a1238d5c34dd6269a1838136bc1184"} Dec 05 09:30:02 crc 
kubenswrapper[4795]: I1205 09:30:02.103538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" event={"ID":"22400191-14f2-40bd-a5a8-f353664cc4b9","Type":"ContainerStarted","Data":"791623b9b7073c05c64513fd91e631fda910ab6138fa776b59c3b300fa694a65"} Dec 05 09:30:02 crc kubenswrapper[4795]: I1205 09:30:02.224781 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" podStartSLOduration=2.224754755 podStartE2EDuration="2.224754755s" podCreationTimestamp="2025-12-05 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 09:30:02.191014102 +0000 UTC m=+3953.763617841" watchObservedRunningTime="2025-12-05 09:30:02.224754755 +0000 UTC m=+3953.797358494" Dec 05 09:30:03 crc kubenswrapper[4795]: I1205 09:30:03.115000 4795 generic.go:334] "Generic (PLEG): container finished" podID="22400191-14f2-40bd-a5a8-f353664cc4b9" containerID="1348ab21a38dd3c883d80dc1d2d3fbcdb3a1238d5c34dd6269a1838136bc1184" exitCode=0 Dec 05 09:30:03 crc kubenswrapper[4795]: I1205 09:30:03.115064 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" event={"ID":"22400191-14f2-40bd-a5a8-f353664cc4b9","Type":"ContainerDied","Data":"1348ab21a38dd3c883d80dc1d2d3fbcdb3a1238d5c34dd6269a1838136bc1184"} Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.697702 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.748406 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:30:04 crc kubenswrapper[4795]: E1205 09:30:04.748641 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.794559 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22400191-14f2-40bd-a5a8-f353664cc4b9-config-volume\") pod \"22400191-14f2-40bd-a5a8-f353664cc4b9\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.794682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5dgs\" (UniqueName: \"kubernetes.io/projected/22400191-14f2-40bd-a5a8-f353664cc4b9-kube-api-access-r5dgs\") pod \"22400191-14f2-40bd-a5a8-f353664cc4b9\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.794863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22400191-14f2-40bd-a5a8-f353664cc4b9-secret-volume\") pod \"22400191-14f2-40bd-a5a8-f353664cc4b9\" (UID: \"22400191-14f2-40bd-a5a8-f353664cc4b9\") " Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.795385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/22400191-14f2-40bd-a5a8-f353664cc4b9-config-volume" (OuterVolumeSpecName: "config-volume") pod "22400191-14f2-40bd-a5a8-f353664cc4b9" (UID: "22400191-14f2-40bd-a5a8-f353664cc4b9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.807950 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22400191-14f2-40bd-a5a8-f353664cc4b9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "22400191-14f2-40bd-a5a8-f353664cc4b9" (UID: "22400191-14f2-40bd-a5a8-f353664cc4b9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.812195 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22400191-14f2-40bd-a5a8-f353664cc4b9-kube-api-access-r5dgs" (OuterVolumeSpecName: "kube-api-access-r5dgs") pod "22400191-14f2-40bd-a5a8-f353664cc4b9" (UID: "22400191-14f2-40bd-a5a8-f353664cc4b9"). InnerVolumeSpecName "kube-api-access-r5dgs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.904640 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22400191-14f2-40bd-a5a8-f353664cc4b9-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.905164 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5dgs\" (UniqueName: \"kubernetes.io/projected/22400191-14f2-40bd-a5a8-f353664cc4b9-kube-api-access-r5dgs\") on node \"crc\" DevicePath \"\"" Dec 05 09:30:04 crc kubenswrapper[4795]: I1205 09:30:04.905199 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22400191-14f2-40bd-a5a8-f353664cc4b9-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 09:30:05 crc kubenswrapper[4795]: I1205 09:30:05.139572 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" event={"ID":"22400191-14f2-40bd-a5a8-f353664cc4b9","Type":"ContainerDied","Data":"791623b9b7073c05c64513fd91e631fda910ab6138fa776b59c3b300fa694a65"} Dec 05 09:30:05 crc kubenswrapper[4795]: I1205 09:30:05.139643 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="791623b9b7073c05c64513fd91e631fda910ab6138fa776b59c3b300fa694a65" Dec 05 09:30:05 crc kubenswrapper[4795]: I1205 09:30:05.139714 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415450-5q8dj" Dec 05 09:30:05 crc kubenswrapper[4795]: I1205 09:30:05.816337 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b"] Dec 05 09:30:05 crc kubenswrapper[4795]: I1205 09:30:05.835177 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415405-dct2b"] Dec 05 09:30:05 crc kubenswrapper[4795]: I1205 09:30:05.908877 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:30:05 crc kubenswrapper[4795]: I1205 09:30:05.968923 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:30:06 crc kubenswrapper[4795]: I1205 09:30:06.162593 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5s2pq"] Dec 05 09:30:06 crc kubenswrapper[4795]: I1205 09:30:06.763334 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="286e1852-50a3-4f67-9588-faf932b0d456" path="/var/lib/kubelet/pods/286e1852-50a3-4f67-9588-faf932b0d456/volumes" Dec 05 09:30:07 crc kubenswrapper[4795]: I1205 09:30:07.174456 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5s2pq" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="registry-server" containerID="cri-o://7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca" gracePeriod=2 Dec 05 09:30:07 crc kubenswrapper[4795]: I1205 09:30:07.987765 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.079147 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lnd9\" (UniqueName: \"kubernetes.io/projected/092ca8b6-19f2-40a8-a4bb-f1924982e4db-kube-api-access-4lnd9\") pod \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.079303 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-utilities\") pod \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.079336 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-catalog-content\") pod \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\" (UID: \"092ca8b6-19f2-40a8-a4bb-f1924982e4db\") " Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.083799 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-utilities" (OuterVolumeSpecName: "utilities") pod "092ca8b6-19f2-40a8-a4bb-f1924982e4db" (UID: "092ca8b6-19f2-40a8-a4bb-f1924982e4db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.091989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092ca8b6-19f2-40a8-a4bb-f1924982e4db-kube-api-access-4lnd9" (OuterVolumeSpecName: "kube-api-access-4lnd9") pod "092ca8b6-19f2-40a8-a4bb-f1924982e4db" (UID: "092ca8b6-19f2-40a8-a4bb-f1924982e4db"). InnerVolumeSpecName "kube-api-access-4lnd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.152095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "092ca8b6-19f2-40a8-a4bb-f1924982e4db" (UID: "092ca8b6-19f2-40a8-a4bb-f1924982e4db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.182127 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lnd9\" (UniqueName: \"kubernetes.io/projected/092ca8b6-19f2-40a8-a4bb-f1924982e4db-kube-api-access-4lnd9\") on node \"crc\" DevicePath \"\"" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.182171 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.182186 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/092ca8b6-19f2-40a8-a4bb-f1924982e4db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.188431 4795 generic.go:334] "Generic (PLEG): container finished" podID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerID="7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca" exitCode=0 Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.188489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5s2pq" event={"ID":"092ca8b6-19f2-40a8-a4bb-f1924982e4db","Type":"ContainerDied","Data":"7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca"} Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.188548 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-5s2pq" event={"ID":"092ca8b6-19f2-40a8-a4bb-f1924982e4db","Type":"ContainerDied","Data":"e27438f18c7384717817ec3953f8532f90a0e11966d1c2ddea59a3d86a535277"} Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.188572 4795 scope.go:117] "RemoveContainer" containerID="7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.188989 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5s2pq" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.219238 4795 scope.go:117] "RemoveContainer" containerID="9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.247973 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5s2pq"] Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.258441 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5s2pq"] Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.307523 4795 scope.go:117] "RemoveContainer" containerID="0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.332022 4795 scope.go:117] "RemoveContainer" containerID="7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca" Dec 05 09:30:08 crc kubenswrapper[4795]: E1205 09:30:08.333116 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca\": container with ID starting with 7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca not found: ID does not exist" containerID="7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 
09:30:08.333189 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca"} err="failed to get container status \"7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca\": rpc error: code = NotFound desc = could not find container \"7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca\": container with ID starting with 7ef83c36c842fc7e6c5b73dd50ad15b03363408440d681ae4ae1f4fcf60174ca not found: ID does not exist" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.333832 4795 scope.go:117] "RemoveContainer" containerID="9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3" Dec 05 09:30:08 crc kubenswrapper[4795]: E1205 09:30:08.336107 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3\": container with ID starting with 9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3 not found: ID does not exist" containerID="9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.336137 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3"} err="failed to get container status \"9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3\": rpc error: code = NotFound desc = could not find container \"9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3\": container with ID starting with 9cdd54231444f16b3f57770f43eeeffb9a483c98394c0bfdc40ad471672f52f3 not found: ID does not exist" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.336165 4795 scope.go:117] "RemoveContainer" containerID="0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486" Dec 05 09:30:08 crc 
kubenswrapper[4795]: E1205 09:30:08.336540 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486\": container with ID starting with 0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486 not found: ID does not exist" containerID="0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.336567 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486"} err="failed to get container status \"0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486\": rpc error: code = NotFound desc = could not find container \"0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486\": container with ID starting with 0c5b0a82ede8bfeea0f6e91654bfb8417b8b10c11f1b801a121d04985267e486 not found: ID does not exist" Dec 05 09:30:08 crc kubenswrapper[4795]: I1205 09:30:08.764915 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" path="/var/lib/kubelet/pods/092ca8b6-19f2-40a8-a4bb-f1924982e4db/volumes" Dec 05 09:30:19 crc kubenswrapper[4795]: I1205 09:30:19.747327 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:30:19 crc kubenswrapper[4795]: E1205 09:30:19.748415 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:30:30 crc 
kubenswrapper[4795]: I1205 09:30:30.748027 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:30:30 crc kubenswrapper[4795]: E1205 09:30:30.749281 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:30:45 crc kubenswrapper[4795]: I1205 09:30:45.747795 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:30:45 crc kubenswrapper[4795]: E1205 09:30:45.748847 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:30:54 crc kubenswrapper[4795]: I1205 09:30:54.146281 4795 scope.go:117] "RemoveContainer" containerID="9fe1328145a5c8747c67de153cf214e883013a3aea00b96b0f3c8402dff90270" Dec 05 09:30:58 crc kubenswrapper[4795]: I1205 09:30:58.756095 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:30:58 crc kubenswrapper[4795]: E1205 09:30:58.757261 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:31:11 crc kubenswrapper[4795]: I1205 09:31:11.747759 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:31:11 crc kubenswrapper[4795]: E1205 09:31:11.748798 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:31:24 crc kubenswrapper[4795]: I1205 09:31:24.747718 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:31:24 crc kubenswrapper[4795]: E1205 09:31:24.748714 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.217006 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nfs6z"] Dec 05 09:31:36 crc kubenswrapper[4795]: E1205 09:31:36.222928 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22400191-14f2-40bd-a5a8-f353664cc4b9" containerName="collect-profiles" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 
09:31:36.222970 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="22400191-14f2-40bd-a5a8-f353664cc4b9" containerName="collect-profiles" Dec 05 09:31:36 crc kubenswrapper[4795]: E1205 09:31:36.222996 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="extract-content" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.223009 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="extract-content" Dec 05 09:31:36 crc kubenswrapper[4795]: E1205 09:31:36.223041 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="registry-server" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.223048 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="registry-server" Dec 05 09:31:36 crc kubenswrapper[4795]: E1205 09:31:36.223064 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="extract-utilities" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.223074 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="extract-utilities" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.232404 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="092ca8b6-19f2-40a8-a4bb-f1924982e4db" containerName="registry-server" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.232500 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="22400191-14f2-40bd-a5a8-f353664cc4b9" containerName="collect-profiles" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.247364 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.273789 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfs6z"] Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.395934 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-utilities\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.396698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-catalog-content\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.396822 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82xpf\" (UniqueName: \"kubernetes.io/projected/65ed029a-5065-4d32-8b7c-06626c54f45f-kube-api-access-82xpf\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.500013 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-utilities\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.500092 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-catalog-content\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.500129 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82xpf\" (UniqueName: \"kubernetes.io/projected/65ed029a-5065-4d32-8b7c-06626c54f45f-kube-api-access-82xpf\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.501404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-utilities\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.502092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-catalog-content\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.542239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82xpf\" (UniqueName: \"kubernetes.io/projected/65ed029a-5065-4d32-8b7c-06626c54f45f-kube-api-access-82xpf\") pod \"redhat-marketplace-nfs6z\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:36 crc kubenswrapper[4795]: I1205 09:31:36.622947 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:37 crc kubenswrapper[4795]: I1205 09:31:37.217926 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfs6z"] Dec 05 09:31:37 crc kubenswrapper[4795]: I1205 09:31:37.337902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfs6z" event={"ID":"65ed029a-5065-4d32-8b7c-06626c54f45f","Type":"ContainerStarted","Data":"cb616d1f18bc649acaab8d7cd09a8079176722b274f7d0fc4e907648ad1747bd"} Dec 05 09:31:38 crc kubenswrapper[4795]: I1205 09:31:38.349259 4795 generic.go:334] "Generic (PLEG): container finished" podID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerID="7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff" exitCode=0 Dec 05 09:31:38 crc kubenswrapper[4795]: I1205 09:31:38.349335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfs6z" event={"ID":"65ed029a-5065-4d32-8b7c-06626c54f45f","Type":"ContainerDied","Data":"7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff"} Dec 05 09:31:39 crc kubenswrapper[4795]: I1205 09:31:39.748448 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:31:39 crc kubenswrapper[4795]: E1205 09:31:39.749202 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:31:40 crc kubenswrapper[4795]: I1205 09:31:40.391492 4795 generic.go:334] "Generic (PLEG): container finished" podID="65ed029a-5065-4d32-8b7c-06626c54f45f" 
containerID="946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97" exitCode=0 Dec 05 09:31:40 crc kubenswrapper[4795]: I1205 09:31:40.391679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfs6z" event={"ID":"65ed029a-5065-4d32-8b7c-06626c54f45f","Type":"ContainerDied","Data":"946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97"} Dec 05 09:31:41 crc kubenswrapper[4795]: I1205 09:31:41.404349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfs6z" event={"ID":"65ed029a-5065-4d32-8b7c-06626c54f45f","Type":"ContainerStarted","Data":"530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732"} Dec 05 09:31:41 crc kubenswrapper[4795]: I1205 09:31:41.429056 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nfs6z" podStartSLOduration=2.987568167 podStartE2EDuration="5.42903266s" podCreationTimestamp="2025-12-05 09:31:36 +0000 UTC" firstStartedPulling="2025-12-05 09:31:38.352300584 +0000 UTC m=+4049.924904363" lastFinishedPulling="2025-12-05 09:31:40.793765117 +0000 UTC m=+4052.366368856" observedRunningTime="2025-12-05 09:31:41.426232975 +0000 UTC m=+4052.998836714" watchObservedRunningTime="2025-12-05 09:31:41.42903266 +0000 UTC m=+4053.001636399" Dec 05 09:31:46 crc kubenswrapper[4795]: I1205 09:31:46.623587 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:46 crc kubenswrapper[4795]: I1205 09:31:46.624485 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:46 crc kubenswrapper[4795]: I1205 09:31:46.686890 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:47 crc kubenswrapper[4795]: I1205 09:31:47.552805 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:47 crc kubenswrapper[4795]: I1205 09:31:47.651441 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfs6z"] Dec 05 09:31:49 crc kubenswrapper[4795]: I1205 09:31:49.506794 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nfs6z" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerName="registry-server" containerID="cri-o://530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732" gracePeriod=2 Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.270763 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.440009 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-utilities\") pod \"65ed029a-5065-4d32-8b7c-06626c54f45f\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.440162 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82xpf\" (UniqueName: \"kubernetes.io/projected/65ed029a-5065-4d32-8b7c-06626c54f45f-kube-api-access-82xpf\") pod \"65ed029a-5065-4d32-8b7c-06626c54f45f\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.440237 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-catalog-content\") pod \"65ed029a-5065-4d32-8b7c-06626c54f45f\" (UID: \"65ed029a-5065-4d32-8b7c-06626c54f45f\") " Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.440814 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-utilities" (OuterVolumeSpecName: "utilities") pod "65ed029a-5065-4d32-8b7c-06626c54f45f" (UID: "65ed029a-5065-4d32-8b7c-06626c54f45f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.447800 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ed029a-5065-4d32-8b7c-06626c54f45f-kube-api-access-82xpf" (OuterVolumeSpecName: "kube-api-access-82xpf") pod "65ed029a-5065-4d32-8b7c-06626c54f45f" (UID: "65ed029a-5065-4d32-8b7c-06626c54f45f"). InnerVolumeSpecName "kube-api-access-82xpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.464193 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65ed029a-5065-4d32-8b7c-06626c54f45f" (UID: "65ed029a-5065-4d32-8b7c-06626c54f45f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.521845 4795 generic.go:334] "Generic (PLEG): container finished" podID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerID="530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732" exitCode=0 Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.521901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfs6z" event={"ID":"65ed029a-5065-4d32-8b7c-06626c54f45f","Type":"ContainerDied","Data":"530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732"} Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.521940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfs6z" event={"ID":"65ed029a-5065-4d32-8b7c-06626c54f45f","Type":"ContainerDied","Data":"cb616d1f18bc649acaab8d7cd09a8079176722b274f7d0fc4e907648ad1747bd"} Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.521935 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfs6z" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.522023 4795 scope.go:117] "RemoveContainer" containerID="530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.542248 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.542282 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82xpf\" (UniqueName: \"kubernetes.io/projected/65ed029a-5065-4d32-8b7c-06626c54f45f-kube-api-access-82xpf\") on node \"crc\" DevicePath \"\"" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.542295 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65ed029a-5065-4d32-8b7c-06626c54f45f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.567960 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfs6z"] Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.573853 4795 scope.go:117] "RemoveContainer" containerID="946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.581411 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfs6z"] Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.629553 4795 scope.go:117] "RemoveContainer" containerID="7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.660738 4795 scope.go:117] "RemoveContainer" containerID="530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732" Dec 05 09:31:50 crc kubenswrapper[4795]: E1205 
09:31:50.661501 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732\": container with ID starting with 530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732 not found: ID does not exist" containerID="530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.661595 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732"} err="failed to get container status \"530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732\": rpc error: code = NotFound desc = could not find container \"530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732\": container with ID starting with 530803e0311f949ae46d7357c643a23fc87bc164120c457222017b5a1cdd8732 not found: ID does not exist" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.661657 4795 scope.go:117] "RemoveContainer" containerID="946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97" Dec 05 09:31:50 crc kubenswrapper[4795]: E1205 09:31:50.662275 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97\": container with ID starting with 946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97 not found: ID does not exist" containerID="946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.662305 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97"} err="failed to get container status \"946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97\": rpc 
error: code = NotFound desc = could not find container \"946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97\": container with ID starting with 946dcf9c312d7c05321831486bd86cb1c0a93a7acad54aaa2f33b84c45172d97 not found: ID does not exist" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.662329 4795 scope.go:117] "RemoveContainer" containerID="7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff" Dec 05 09:31:50 crc kubenswrapper[4795]: E1205 09:31:50.662679 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff\": container with ID starting with 7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff not found: ID does not exist" containerID="7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.662731 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff"} err="failed to get container status \"7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff\": rpc error: code = NotFound desc = could not find container \"7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff\": container with ID starting with 7677807abf700ed31b4d0d3eddc0f43ef801c9bf554e8950f53fc89487712cff not found: ID does not exist" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.748148 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:31:50 crc kubenswrapper[4795]: E1205 09:31:50.748467 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:31:50 crc kubenswrapper[4795]: I1205 09:31:50.759712 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" path="/var/lib/kubelet/pods/65ed029a-5065-4d32-8b7c-06626c54f45f/volumes" Dec 05 09:32:04 crc kubenswrapper[4795]: I1205 09:32:04.747770 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:32:04 crc kubenswrapper[4795]: E1205 09:32:04.748697 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:32:15 crc kubenswrapper[4795]: I1205 09:32:15.747230 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:32:15 crc kubenswrapper[4795]: E1205 09:32:15.748235 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.216650 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n94pj"] Dec 05 09:32:23 
crc kubenswrapper[4795]: E1205 09:32:23.218172 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerName="extract-content" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.218200 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerName="extract-content" Dec 05 09:32:23 crc kubenswrapper[4795]: E1205 09:32:23.218223 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerName="registry-server" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.218232 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerName="registry-server" Dec 05 09:32:23 crc kubenswrapper[4795]: E1205 09:32:23.218277 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerName="extract-utilities" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.218287 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerName="extract-utilities" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.218557 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ed029a-5065-4d32-8b7c-06626c54f45f" containerName="registry-server" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.220658 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.222004 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-utilities\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.222076 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s4br\" (UniqueName: \"kubernetes.io/projected/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-kube-api-access-2s4br\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.222330 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-catalog-content\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.236582 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n94pj"] Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.324572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-catalog-content\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.324695 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-utilities\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.324745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s4br\" (UniqueName: \"kubernetes.io/projected/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-kube-api-access-2s4br\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.325315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-catalog-content\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.325400 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-utilities\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.350516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s4br\" (UniqueName: \"kubernetes.io/projected/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-kube-api-access-2s4br\") pod \"redhat-operators-n94pj\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:23 crc kubenswrapper[4795]: I1205 09:32:23.544955 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:24 crc kubenswrapper[4795]: I1205 09:32:24.107550 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n94pj"] Dec 05 09:32:24 crc kubenswrapper[4795]: I1205 09:32:24.881989 4795 generic.go:334] "Generic (PLEG): container finished" podID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerID="8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787" exitCode=0 Dec 05 09:32:24 crc kubenswrapper[4795]: I1205 09:32:24.882102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n94pj" event={"ID":"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3","Type":"ContainerDied","Data":"8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787"} Dec 05 09:32:24 crc kubenswrapper[4795]: I1205 09:32:24.882452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n94pj" event={"ID":"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3","Type":"ContainerStarted","Data":"920bb189aec2beef274df8d1f4ddc360473054144bd36455b6265c4f78b0619f"} Dec 05 09:32:25 crc kubenswrapper[4795]: I1205 09:32:25.898327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n94pj" event={"ID":"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3","Type":"ContainerStarted","Data":"68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0"} Dec 05 09:32:27 crc kubenswrapper[4795]: I1205 09:32:27.747726 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:32:27 crc kubenswrapper[4795]: E1205 09:32:27.748441 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:32:29 crc kubenswrapper[4795]: I1205 09:32:29.950322 4795 generic.go:334] "Generic (PLEG): container finished" podID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerID="68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0" exitCode=0 Dec 05 09:32:29 crc kubenswrapper[4795]: I1205 09:32:29.950403 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n94pj" event={"ID":"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3","Type":"ContainerDied","Data":"68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0"} Dec 05 09:32:30 crc kubenswrapper[4795]: I1205 09:32:30.965765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n94pj" event={"ID":"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3","Type":"ContainerStarted","Data":"dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a"} Dec 05 09:32:30 crc kubenswrapper[4795]: I1205 09:32:30.990825 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n94pj" podStartSLOduration=2.291364669 podStartE2EDuration="7.990802844s" podCreationTimestamp="2025-12-05 09:32:23 +0000 UTC" firstStartedPulling="2025-12-05 09:32:24.884085379 +0000 UTC m=+4096.456689118" lastFinishedPulling="2025-12-05 09:32:30.583523554 +0000 UTC m=+4102.156127293" observedRunningTime="2025-12-05 09:32:30.986764786 +0000 UTC m=+4102.559368525" watchObservedRunningTime="2025-12-05 09:32:30.990802844 +0000 UTC m=+4102.563406583" Dec 05 09:32:33 crc kubenswrapper[4795]: I1205 09:32:33.545573 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:33 crc kubenswrapper[4795]: I1205 09:32:33.546397 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:34 crc kubenswrapper[4795]: I1205 09:32:34.772634 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n94pj" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerName="registry-server" probeResult="failure" output=< Dec 05 09:32:34 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:32:34 crc kubenswrapper[4795]: > Dec 05 09:32:38 crc kubenswrapper[4795]: I1205 09:32:38.756829 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:32:38 crc kubenswrapper[4795]: E1205 09:32:38.758917 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:32:43 crc kubenswrapper[4795]: I1205 09:32:43.606365 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:43 crc kubenswrapper[4795]: I1205 09:32:43.667834 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:43 crc kubenswrapper[4795]: I1205 09:32:43.849808 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n94pj"] Dec 05 09:32:45 crc kubenswrapper[4795]: I1205 09:32:45.126657 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n94pj" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" 
containerName="registry-server" containerID="cri-o://dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a" gracePeriod=2 Dec 05 09:32:45 crc kubenswrapper[4795]: I1205 09:32:45.911949 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.028101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-catalog-content\") pod \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.028502 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-utilities\") pod \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.028590 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s4br\" (UniqueName: \"kubernetes.io/projected/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-kube-api-access-2s4br\") pod \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\" (UID: \"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3\") " Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.029389 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-utilities" (OuterVolumeSpecName: "utilities") pod "0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" (UID: "0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.044542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-kube-api-access-2s4br" (OuterVolumeSpecName: "kube-api-access-2s4br") pod "0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" (UID: "0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3"). InnerVolumeSpecName "kube-api-access-2s4br". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.131895 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.132737 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s4br\" (UniqueName: \"kubernetes.io/projected/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-kube-api-access-2s4br\") on node \"crc\" DevicePath \"\"" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.145910 4795 generic.go:334] "Generic (PLEG): container finished" podID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerID="dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a" exitCode=0 Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.146133 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n94pj" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.146171 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n94pj" event={"ID":"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3","Type":"ContainerDied","Data":"dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a"} Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.146854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n94pj" event={"ID":"0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3","Type":"ContainerDied","Data":"920bb189aec2beef274df8d1f4ddc360473054144bd36455b6265c4f78b0619f"} Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.146938 4795 scope.go:117] "RemoveContainer" containerID="dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.151804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" (UID: "0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.182998 4795 scope.go:117] "RemoveContainer" containerID="68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.213707 4795 scope.go:117] "RemoveContainer" containerID="8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.235078 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.275057 4795 scope.go:117] "RemoveContainer" containerID="dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a" Dec 05 09:32:46 crc kubenswrapper[4795]: E1205 09:32:46.276559 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a\": container with ID starting with dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a not found: ID does not exist" containerID="dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.276606 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a"} err="failed to get container status \"dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a\": rpc error: code = NotFound desc = could not find container \"dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a\": container with ID starting with dae8c309aeb38181531827aef4ac66d786a034927b5ab13ea2fb0b006599fd9a not found: ID does not exist" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.276720 4795 
scope.go:117] "RemoveContainer" containerID="68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0" Dec 05 09:32:46 crc kubenswrapper[4795]: E1205 09:32:46.277120 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0\": container with ID starting with 68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0 not found: ID does not exist" containerID="68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.277153 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0"} err="failed to get container status \"68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0\": rpc error: code = NotFound desc = could not find container \"68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0\": container with ID starting with 68c91f0e084e676599120b7515c6201dfcae48ff5ada0d364ac8f17913a8cfc0 not found: ID does not exist" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.277180 4795 scope.go:117] "RemoveContainer" containerID="8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787" Dec 05 09:32:46 crc kubenswrapper[4795]: E1205 09:32:46.277433 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787\": container with ID starting with 8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787 not found: ID does not exist" containerID="8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.277467 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787"} err="failed to get container status \"8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787\": rpc error: code = NotFound desc = could not find container \"8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787\": container with ID starting with 8915bea2836f60fd22b858f5d2b3c318760087f6ce56259ff0c5f6795747b787 not found: ID does not exist" Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.482623 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n94pj"] Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.491250 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n94pj"] Dec 05 09:32:46 crc kubenswrapper[4795]: I1205 09:32:46.767162 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" path="/var/lib/kubelet/pods/0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3/volumes" Dec 05 09:32:50 crc kubenswrapper[4795]: I1205 09:32:50.750259 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:32:50 crc kubenswrapper[4795]: E1205 09:32:50.754936 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:33:01 crc kubenswrapper[4795]: I1205 09:33:01.747936 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:33:01 crc kubenswrapper[4795]: E1205 09:33:01.754016 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:33:14 crc kubenswrapper[4795]: I1205 09:33:14.747874 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:33:15 crc kubenswrapper[4795]: I1205 09:33:15.460406 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"8f4d07fab9983dd66e900eb0435a6a61c1d7de6a476dd8bad70c40f1039da569"} Dec 05 09:35:40 crc kubenswrapper[4795]: I1205 09:35:40.827680 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:35:40 crc kubenswrapper[4795]: I1205 09:35:40.828492 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:36:10 crc kubenswrapper[4795]: I1205 09:36:10.826780 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 05 09:36:10 crc kubenswrapper[4795]: I1205 09:36:10.827533 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:36:14 crc kubenswrapper[4795]: I1205 09:36:14.697690 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="09a55d95-050f-4262-9bb4-7dc81ae6ea34" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:36:40 crc kubenswrapper[4795]: I1205 09:36:40.827250 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:36:40 crc kubenswrapper[4795]: I1205 09:36:40.828166 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:36:40 crc kubenswrapper[4795]: I1205 09:36:40.828247 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:36:40 crc kubenswrapper[4795]: I1205 09:36:40.829278 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f4d07fab9983dd66e900eb0435a6a61c1d7de6a476dd8bad70c40f1039da569"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:36:40 crc kubenswrapper[4795]: I1205 09:36:40.829344 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://8f4d07fab9983dd66e900eb0435a6a61c1d7de6a476dd8bad70c40f1039da569" gracePeriod=600 Dec 05 09:36:41 crc kubenswrapper[4795]: I1205 09:36:41.614273 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="8f4d07fab9983dd66e900eb0435a6a61c1d7de6a476dd8bad70c40f1039da569" exitCode=0 Dec 05 09:36:41 crc kubenswrapper[4795]: I1205 09:36:41.615003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"8f4d07fab9983dd66e900eb0435a6a61c1d7de6a476dd8bad70c40f1039da569"} Dec 05 09:36:41 crc kubenswrapper[4795]: I1205 09:36:41.615085 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a"} Dec 05 09:36:41 crc kubenswrapper[4795]: I1205 09:36:41.615116 4795 scope.go:117] "RemoveContainer" containerID="bf2e238d2ff967b043728e5e461e5fa4591e875f365e4e5e0a8b7c961f9b16e7" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.526856 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nd6qq"] Dec 05 09:36:46 crc kubenswrapper[4795]: E1205 09:36:46.528048 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerName="extract-utilities" Dec 05 09:36:46 crc 
kubenswrapper[4795]: I1205 09:36:46.528066 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerName="extract-utilities" Dec 05 09:36:46 crc kubenswrapper[4795]: E1205 09:36:46.528088 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerName="registry-server" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.528094 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerName="registry-server" Dec 05 09:36:46 crc kubenswrapper[4795]: E1205 09:36:46.528104 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerName="extract-content" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.528115 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerName="extract-content" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.528356 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae065ab-ef74-49f5-aeaf-c4c41ddbb0a3" containerName="registry-server" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.531087 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.548010 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nd6qq"] Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.686117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-catalog-content\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.686797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-utilities\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.686833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc8rn\" (UniqueName: \"kubernetes.io/projected/1e5667a3-5462-4a7b-bdca-3d736a370650-kube-api-access-wc8rn\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.788883 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-catalog-content\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.789057 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-utilities\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.789116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc8rn\" (UniqueName: \"kubernetes.io/projected/1e5667a3-5462-4a7b-bdca-3d736a370650-kube-api-access-wc8rn\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.789666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-catalog-content\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.790157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-utilities\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.813539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc8rn\" (UniqueName: \"kubernetes.io/projected/1e5667a3-5462-4a7b-bdca-3d736a370650-kube-api-access-wc8rn\") pod \"certified-operators-nd6qq\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:46 crc kubenswrapper[4795]: I1205 09:36:46.851817 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:48 crc kubenswrapper[4795]: I1205 09:36:48.892327 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nd6qq"] Dec 05 09:36:49 crc kubenswrapper[4795]: I1205 09:36:49.700413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6qq" event={"ID":"1e5667a3-5462-4a7b-bdca-3d736a370650","Type":"ContainerStarted","Data":"4c94758810fee0476e3371ab08aab14f770fa7b849a4dcc3fe4bf5dd96198751"} Dec 05 09:36:51 crc kubenswrapper[4795]: I1205 09:36:51.736638 4795 generic.go:334] "Generic (PLEG): container finished" podID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerID="b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d" exitCode=0 Dec 05 09:36:51 crc kubenswrapper[4795]: I1205 09:36:51.736879 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6qq" event={"ID":"1e5667a3-5462-4a7b-bdca-3d736a370650","Type":"ContainerDied","Data":"b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d"} Dec 05 09:36:51 crc kubenswrapper[4795]: I1205 09:36:51.741281 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:36:54 crc kubenswrapper[4795]: I1205 09:36:54.799376 4795 generic.go:334] "Generic (PLEG): container finished" podID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerID="8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1" exitCode=0 Dec 05 09:36:54 crc kubenswrapper[4795]: I1205 09:36:54.799446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6qq" event={"ID":"1e5667a3-5462-4a7b-bdca-3d736a370650","Type":"ContainerDied","Data":"8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1"} Dec 05 09:36:55 crc kubenswrapper[4795]: I1205 09:36:55.811660 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-nd6qq" event={"ID":"1e5667a3-5462-4a7b-bdca-3d736a370650","Type":"ContainerStarted","Data":"d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66"} Dec 05 09:36:55 crc kubenswrapper[4795]: I1205 09:36:55.836475 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nd6qq" podStartSLOduration=6.3319196699999996 podStartE2EDuration="9.836450358s" podCreationTimestamp="2025-12-05 09:36:46 +0000 UTC" firstStartedPulling="2025-12-05 09:36:51.74104314 +0000 UTC m=+4363.313646879" lastFinishedPulling="2025-12-05 09:36:55.245573838 +0000 UTC m=+4366.818177567" observedRunningTime="2025-12-05 09:36:55.834041534 +0000 UTC m=+4367.406645273" watchObservedRunningTime="2025-12-05 09:36:55.836450358 +0000 UTC m=+4367.409054097" Dec 05 09:36:56 crc kubenswrapper[4795]: I1205 09:36:56.852272 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:56 crc kubenswrapper[4795]: I1205 09:36:56.852774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:36:57 crc kubenswrapper[4795]: I1205 09:36:57.911282 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nd6qq" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="registry-server" probeResult="failure" output=< Dec 05 09:36:57 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:36:57 crc kubenswrapper[4795]: > Dec 05 09:36:58 crc kubenswrapper[4795]: I1205 09:36:58.551841 4795 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v8qz8 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 05 09:36:58 crc kubenswrapper[4795]: I1205 09:36:58.552321 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v8qz8" podUID="62d33db5-212f-4884-b78b-159f06592142" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 05 09:37:06 crc kubenswrapper[4795]: I1205 09:37:06.908671 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:37:06 crc kubenswrapper[4795]: I1205 09:37:06.988552 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:37:07 crc kubenswrapper[4795]: I1205 09:37:07.169486 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nd6qq"] Dec 05 09:37:07 crc kubenswrapper[4795]: I1205 09:37:07.944532 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nd6qq" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="registry-server" containerID="cri-o://d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66" gracePeriod=2 Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.646977 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.742699 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-catalog-content\") pod \"1e5667a3-5462-4a7b-bdca-3d736a370650\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.742920 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-utilities\") pod \"1e5667a3-5462-4a7b-bdca-3d736a370650\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.743044 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc8rn\" (UniqueName: \"kubernetes.io/projected/1e5667a3-5462-4a7b-bdca-3d736a370650-kube-api-access-wc8rn\") pod \"1e5667a3-5462-4a7b-bdca-3d736a370650\" (UID: \"1e5667a3-5462-4a7b-bdca-3d736a370650\") " Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.743777 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-utilities" (OuterVolumeSpecName: "utilities") pod "1e5667a3-5462-4a7b-bdca-3d736a370650" (UID: "1e5667a3-5462-4a7b-bdca-3d736a370650"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.744683 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.767023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5667a3-5462-4a7b-bdca-3d736a370650-kube-api-access-wc8rn" (OuterVolumeSpecName: "kube-api-access-wc8rn") pod "1e5667a3-5462-4a7b-bdca-3d736a370650" (UID: "1e5667a3-5462-4a7b-bdca-3d736a370650"). InnerVolumeSpecName "kube-api-access-wc8rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.811993 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e5667a3-5462-4a7b-bdca-3d736a370650" (UID: "1e5667a3-5462-4a7b-bdca-3d736a370650"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.847380 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc8rn\" (UniqueName: \"kubernetes.io/projected/1e5667a3-5462-4a7b-bdca-3d736a370650-kube-api-access-wc8rn\") on node \"crc\" DevicePath \"\"" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.847444 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5667a3-5462-4a7b-bdca-3d736a370650-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.957853 4795 generic.go:334] "Generic (PLEG): container finished" podID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerID="d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66" exitCode=0 Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.957920 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6qq" event={"ID":"1e5667a3-5462-4a7b-bdca-3d736a370650","Type":"ContainerDied","Data":"d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66"} Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.957963 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nd6qq" event={"ID":"1e5667a3-5462-4a7b-bdca-3d736a370650","Type":"ContainerDied","Data":"4c94758810fee0476e3371ab08aab14f770fa7b849a4dcc3fe4bf5dd96198751"} Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.957987 4795 scope.go:117] "RemoveContainer" containerID="d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.957981 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nd6qq" Dec 05 09:37:08 crc kubenswrapper[4795]: I1205 09:37:08.996625 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nd6qq"] Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.008361 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nd6qq"] Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.010585 4795 scope.go:117] "RemoveContainer" containerID="8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1" Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.050179 4795 scope.go:117] "RemoveContainer" containerID="b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d" Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.097828 4795 scope.go:117] "RemoveContainer" containerID="d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66" Dec 05 09:37:09 crc kubenswrapper[4795]: E1205 09:37:09.098282 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66\": container with ID starting with d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66 not found: ID does not exist" containerID="d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66" Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.098351 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66"} err="failed to get container status \"d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66\": rpc error: code = NotFound desc = could not find container \"d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66\": container with ID starting with d1bdd3ce690dd7e91ec9a45ff3579f9b197cb5f3077b8c65802d9591612cdb66 not 
found: ID does not exist" Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.098387 4795 scope.go:117] "RemoveContainer" containerID="8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1" Dec 05 09:37:09 crc kubenswrapper[4795]: E1205 09:37:09.099127 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1\": container with ID starting with 8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1 not found: ID does not exist" containerID="8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1" Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.099168 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1"} err="failed to get container status \"8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1\": rpc error: code = NotFound desc = could not find container \"8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1\": container with ID starting with 8f0a83b63d562bf9427a7d8d787081c1956536629d51b909ed6b92326502f4c1 not found: ID does not exist" Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.099207 4795 scope.go:117] "RemoveContainer" containerID="b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d" Dec 05 09:37:09 crc kubenswrapper[4795]: E1205 09:37:09.099499 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d\": container with ID starting with b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d not found: ID does not exist" containerID="b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d" Dec 05 09:37:09 crc kubenswrapper[4795]: I1205 09:37:09.099550 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d"} err="failed to get container status \"b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d\": rpc error: code = NotFound desc = could not find container \"b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d\": container with ID starting with b03618a1b411362dc77bf355da8d216b862badee5300f4cc17c9476de500b78d not found: ID does not exist" Dec 05 09:37:10 crc kubenswrapper[4795]: I1205 09:37:10.760751 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" path="/var/lib/kubelet/pods/1e5667a3-5462-4a7b-bdca-3d736a370650/volumes" Dec 05 09:39:10 crc kubenswrapper[4795]: I1205 09:39:10.827397 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:39:10 crc kubenswrapper[4795]: I1205 09:39:10.828246 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:39:40 crc kubenswrapper[4795]: I1205 09:39:40.826964 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:39:40 crc kubenswrapper[4795]: I1205 09:39:40.827574 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:40:10 crc kubenswrapper[4795]: I1205 09:40:10.826900 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:40:10 crc kubenswrapper[4795]: I1205 09:40:10.827684 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:40:10 crc kubenswrapper[4795]: I1205 09:40:10.827752 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:40:10 crc kubenswrapper[4795]: I1205 09:40:10.828725 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:40:10 crc kubenswrapper[4795]: I1205 09:40:10.828779 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" 
containerID="cri-o://de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" gracePeriod=600 Dec 05 09:40:10 crc kubenswrapper[4795]: E1205 09:40:10.958409 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:40:11 crc kubenswrapper[4795]: I1205 09:40:11.015077 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" exitCode=0 Dec 05 09:40:11 crc kubenswrapper[4795]: I1205 09:40:11.015162 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a"} Dec 05 09:40:11 crc kubenswrapper[4795]: I1205 09:40:11.015280 4795 scope.go:117] "RemoveContainer" containerID="8f4d07fab9983dd66e900eb0435a6a61c1d7de6a476dd8bad70c40f1039da569" Dec 05 09:40:11 crc kubenswrapper[4795]: I1205 09:40:11.016314 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:40:11 crc kubenswrapper[4795]: E1205 09:40:11.016694 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:40:25 crc kubenswrapper[4795]: I1205 09:40:25.747943 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:40:25 crc kubenswrapper[4795]: E1205 09:40:25.749047 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:40:39 crc kubenswrapper[4795]: I1205 09:40:39.747710 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:40:39 crc kubenswrapper[4795]: E1205 09:40:39.748798 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.650014 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmwrt"] Dec 05 09:40:51 crc kubenswrapper[4795]: E1205 09:40:51.656218 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="extract-utilities" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.656268 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="extract-utilities" Dec 05 09:40:51 crc 
kubenswrapper[4795]: E1205 09:40:51.656293 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="extract-content" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.656304 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="extract-content" Dec 05 09:40:51 crc kubenswrapper[4795]: E1205 09:40:51.656339 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="registry-server" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.656349 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="registry-server" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.656604 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5667a3-5462-4a7b-bdca-3d736a370650" containerName="registry-server" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.658384 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.680144 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmwrt"] Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.713886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vgsj\" (UniqueName: \"kubernetes.io/projected/e284833e-e525-450e-b653-1e244a69b4df-kube-api-access-7vgsj\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.714071 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-catalog-content\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.714106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-utilities\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.817647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vgsj\" (UniqueName: \"kubernetes.io/projected/e284833e-e525-450e-b653-1e244a69b4df-kube-api-access-7vgsj\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.817840 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-catalog-content\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.817878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-utilities\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.818752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-utilities\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.819261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-catalog-content\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.841893 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vgsj\" (UniqueName: \"kubernetes.io/projected/e284833e-e525-450e-b653-1e244a69b4df-kube-api-access-7vgsj\") pod \"community-operators-wmwrt\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:51 crc kubenswrapper[4795]: I1205 09:40:51.989311 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:40:54 crc kubenswrapper[4795]: I1205 09:40:52.823807 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmwrt"] Dec 05 09:40:54 crc kubenswrapper[4795]: I1205 09:40:53.481630 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwrt" event={"ID":"e284833e-e525-450e-b653-1e244a69b4df","Type":"ContainerStarted","Data":"cf5d77645d2a4af59b850d8ffb5c9b69f393d8a5328a4aee8a8e123f2166922f"} Dec 05 09:40:54 crc kubenswrapper[4795]: I1205 09:40:53.699349 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:40:54 crc kubenswrapper[4795]: I1205 09:40:53.699349 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:40:54 crc kubenswrapper[4795]: I1205 09:40:54.494796 4795 generic.go:334] "Generic (PLEG): container finished" podID="e284833e-e525-450e-b653-1e244a69b4df" containerID="05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6" exitCode=0 Dec 05 09:40:54 crc kubenswrapper[4795]: I1205 09:40:54.495149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwrt" event={"ID":"e284833e-e525-450e-b653-1e244a69b4df","Type":"ContainerDied","Data":"05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6"} Dec 05 09:40:54 crc kubenswrapper[4795]: I1205 09:40:54.747353 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:40:54 crc kubenswrapper[4795]: E1205 09:40:54.747645 4795 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:40:55 crc kubenswrapper[4795]: I1205 09:40:55.510759 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwrt" event={"ID":"e284833e-e525-450e-b653-1e244a69b4df","Type":"ContainerStarted","Data":"21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3"} Dec 05 09:40:56 crc kubenswrapper[4795]: I1205 09:40:56.527721 4795 generic.go:334] "Generic (PLEG): container finished" podID="e284833e-e525-450e-b653-1e244a69b4df" containerID="21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3" exitCode=0 Dec 05 09:40:56 crc kubenswrapper[4795]: I1205 09:40:56.527796 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwrt" event={"ID":"e284833e-e525-450e-b653-1e244a69b4df","Type":"ContainerDied","Data":"21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3"} Dec 05 09:40:57 crc kubenswrapper[4795]: I1205 09:40:57.542165 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwrt" event={"ID":"e284833e-e525-450e-b653-1e244a69b4df","Type":"ContainerStarted","Data":"51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c"} Dec 05 09:40:57 crc kubenswrapper[4795]: I1205 09:40:57.575206 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmwrt" podStartSLOduration=4.154031692 podStartE2EDuration="6.575179707s" podCreationTimestamp="2025-12-05 09:40:51 +0000 UTC" firstStartedPulling="2025-12-05 09:40:54.497514872 +0000 UTC m=+4606.070118611" 
lastFinishedPulling="2025-12-05 09:40:56.918662887 +0000 UTC m=+4608.491266626" observedRunningTime="2025-12-05 09:40:57.566288486 +0000 UTC m=+4609.138892225" watchObservedRunningTime="2025-12-05 09:40:57.575179707 +0000 UTC m=+4609.147783446" Dec 05 09:41:01 crc kubenswrapper[4795]: I1205 09:41:01.990825 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:41:01 crc kubenswrapper[4795]: I1205 09:41:01.992489 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:41:02 crc kubenswrapper[4795]: I1205 09:41:02.047288 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:41:02 crc kubenswrapper[4795]: I1205 09:41:02.654766 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:41:02 crc kubenswrapper[4795]: I1205 09:41:02.727681 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmwrt"] Dec 05 09:41:04 crc kubenswrapper[4795]: I1205 09:41:04.605693 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmwrt" podUID="e284833e-e525-450e-b653-1e244a69b4df" containerName="registry-server" containerID="cri-o://51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c" gracePeriod=2 Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.132299 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.273287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vgsj\" (UniqueName: \"kubernetes.io/projected/e284833e-e525-450e-b653-1e244a69b4df-kube-api-access-7vgsj\") pod \"e284833e-e525-450e-b653-1e244a69b4df\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.273714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-catalog-content\") pod \"e284833e-e525-450e-b653-1e244a69b4df\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.273823 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-utilities\") pod \"e284833e-e525-450e-b653-1e244a69b4df\" (UID: \"e284833e-e525-450e-b653-1e244a69b4df\") " Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.274996 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-utilities" (OuterVolumeSpecName: "utilities") pod "e284833e-e525-450e-b653-1e244a69b4df" (UID: "e284833e-e525-450e-b653-1e244a69b4df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.280354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e284833e-e525-450e-b653-1e244a69b4df-kube-api-access-7vgsj" (OuterVolumeSpecName: "kube-api-access-7vgsj") pod "e284833e-e525-450e-b653-1e244a69b4df" (UID: "e284833e-e525-450e-b653-1e244a69b4df"). InnerVolumeSpecName "kube-api-access-7vgsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.339568 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e284833e-e525-450e-b653-1e244a69b4df" (UID: "e284833e-e525-450e-b653-1e244a69b4df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.376525 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vgsj\" (UniqueName: \"kubernetes.io/projected/e284833e-e525-450e-b653-1e244a69b4df-kube-api-access-7vgsj\") on node \"crc\" DevicePath \"\"" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.376579 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.376594 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e284833e-e525-450e-b653-1e244a69b4df-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.625245 4795 generic.go:334] "Generic (PLEG): container finished" podID="e284833e-e525-450e-b653-1e244a69b4df" containerID="51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c" exitCode=0 Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.625314 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwrt" event={"ID":"e284833e-e525-450e-b653-1e244a69b4df","Type":"ContainerDied","Data":"51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c"} Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.625356 4795 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmwrt" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.625411 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmwrt" event={"ID":"e284833e-e525-450e-b653-1e244a69b4df","Type":"ContainerDied","Data":"cf5d77645d2a4af59b850d8ffb5c9b69f393d8a5328a4aee8a8e123f2166922f"} Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.625438 4795 scope.go:117] "RemoveContainer" containerID="51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.651525 4795 scope.go:117] "RemoveContainer" containerID="21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.675505 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmwrt"] Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.690366 4795 scope.go:117] "RemoveContainer" containerID="05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.694764 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmwrt"] Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.752299 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:41:05 crc kubenswrapper[4795]: E1205 09:41:05.752588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:41:05 crc 
kubenswrapper[4795]: I1205 09:41:05.754418 4795 scope.go:117] "RemoveContainer" containerID="51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c" Dec 05 09:41:05 crc kubenswrapper[4795]: E1205 09:41:05.757173 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c\": container with ID starting with 51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c not found: ID does not exist" containerID="51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.757205 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c"} err="failed to get container status \"51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c\": rpc error: code = NotFound desc = could not find container \"51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c\": container with ID starting with 51bac17fd0363de0abfa04f61f42593a9b91c0fbb8c8bd981a6b5a5d32e4801c not found: ID does not exist" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.757238 4795 scope.go:117] "RemoveContainer" containerID="21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3" Dec 05 09:41:05 crc kubenswrapper[4795]: E1205 09:41:05.757637 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3\": container with ID starting with 21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3 not found: ID does not exist" containerID="21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.757655 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3"} err="failed to get container status \"21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3\": rpc error: code = NotFound desc = could not find container \"21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3\": container with ID starting with 21fdffc4ea44a1ae50c91e2dcf936d5624bb0144b45f22755143852d970414b3 not found: ID does not exist" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.757669 4795 scope.go:117] "RemoveContainer" containerID="05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6" Dec 05 09:41:05 crc kubenswrapper[4795]: E1205 09:41:05.757871 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6\": container with ID starting with 05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6 not found: ID does not exist" containerID="05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6" Dec 05 09:41:05 crc kubenswrapper[4795]: I1205 09:41:05.757886 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6"} err="failed to get container status \"05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6\": rpc error: code = NotFound desc = could not find container \"05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6\": container with ID starting with 05d63cc94747d83e229dacff08d7932e35585b1d69a28d58bf550d36109880c6 not found: ID does not exist" Dec 05 09:41:06 crc kubenswrapper[4795]: I1205 09:41:06.760775 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e284833e-e525-450e-b653-1e244a69b4df" path="/var/lib/kubelet/pods/e284833e-e525-450e-b653-1e244a69b4df/volumes" Dec 05 09:41:18 crc kubenswrapper[4795]: I1205 
09:41:18.747569 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:41:18 crc kubenswrapper[4795]: E1205 09:41:18.749383 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:41:30 crc kubenswrapper[4795]: I1205 09:41:30.747233 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:41:30 crc kubenswrapper[4795]: E1205 09:41:30.748102 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:41:44 crc kubenswrapper[4795]: I1205 09:41:44.752064 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:41:44 crc kubenswrapper[4795]: E1205 09:41:44.752846 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:41:57 crc 
kubenswrapper[4795]: I1205 09:41:57.748098 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:41:57 crc kubenswrapper[4795]: E1205 09:41:57.748934 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:42:09 crc kubenswrapper[4795]: I1205 09:42:09.747693 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:42:09 crc kubenswrapper[4795]: E1205 09:42:09.748787 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.258690 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7xjgf"] Dec 05 09:42:10 crc kubenswrapper[4795]: E1205 09:42:10.259560 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e284833e-e525-450e-b653-1e244a69b4df" containerName="extract-content" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.259586 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e284833e-e525-450e-b653-1e244a69b4df" containerName="extract-content" Dec 05 09:42:10 crc kubenswrapper[4795]: E1205 09:42:10.259637 4795 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e284833e-e525-450e-b653-1e244a69b4df" containerName="registry-server" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.259648 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e284833e-e525-450e-b653-1e244a69b4df" containerName="registry-server" Dec 05 09:42:10 crc kubenswrapper[4795]: E1205 09:42:10.259670 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e284833e-e525-450e-b653-1e244a69b4df" containerName="extract-utilities" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.259679 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e284833e-e525-450e-b653-1e244a69b4df" containerName="extract-utilities" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.259902 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e284833e-e525-450e-b653-1e244a69b4df" containerName="registry-server" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.261543 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.277297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7xjgf"] Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.403807 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4b4h\" (UniqueName: \"kubernetes.io/projected/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-kube-api-access-r4b4h\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.404064 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-utilities\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.404114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-catalog-content\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.506031 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-utilities\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.506107 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-catalog-content\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.506146 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4b4h\" (UniqueName: \"kubernetes.io/projected/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-kube-api-access-r4b4h\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.506801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-catalog-content\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.506817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-utilities\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.536240 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4b4h\" (UniqueName: \"kubernetes.io/projected/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-kube-api-access-r4b4h\") pod \"redhat-marketplace-7xjgf\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:10 crc kubenswrapper[4795]: I1205 09:42:10.602982 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:11 crc kubenswrapper[4795]: I1205 09:42:11.217754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7xjgf"] Dec 05 09:42:11 crc kubenswrapper[4795]: I1205 09:42:11.454746 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7xjgf" event={"ID":"9fb61e0e-eb36-461d-bb4f-afc95453f8a5","Type":"ContainerStarted","Data":"bc52d8bc9698e7089ec725b556c971b83f27ad964e11a1fdbb2a5a52495d1d0b"} Dec 05 09:42:11 crc kubenswrapper[4795]: I1205 09:42:11.455120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7xjgf" event={"ID":"9fb61e0e-eb36-461d-bb4f-afc95453f8a5","Type":"ContainerStarted","Data":"c42e51a615bc7f55b1f27c0c5cbde3425229d9d759cd162b26cfa9023e8bdaa2"} Dec 05 09:42:12 crc kubenswrapper[4795]: I1205 09:42:12.470181 4795 generic.go:334] "Generic (PLEG): container finished" podID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerID="bc52d8bc9698e7089ec725b556c971b83f27ad964e11a1fdbb2a5a52495d1d0b" exitCode=0 Dec 05 09:42:12 crc kubenswrapper[4795]: I1205 09:42:12.470279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7xjgf" event={"ID":"9fb61e0e-eb36-461d-bb4f-afc95453f8a5","Type":"ContainerDied","Data":"bc52d8bc9698e7089ec725b556c971b83f27ad964e11a1fdbb2a5a52495d1d0b"} Dec 05 09:42:12 crc kubenswrapper[4795]: I1205 09:42:12.474464 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:42:14 crc kubenswrapper[4795]: I1205 09:42:14.494672 4795 generic.go:334] "Generic (PLEG): container finished" podID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerID="af76297a7f7ecb38ba43cbc73ab2fa35cc57943d4e5077ca3567e85a33e094a3" exitCode=0 Dec 05 09:42:14 crc kubenswrapper[4795]: I1205 09:42:14.494742 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-7xjgf" event={"ID":"9fb61e0e-eb36-461d-bb4f-afc95453f8a5","Type":"ContainerDied","Data":"af76297a7f7ecb38ba43cbc73ab2fa35cc57943d4e5077ca3567e85a33e094a3"} Dec 05 09:42:18 crc kubenswrapper[4795]: I1205 09:42:18.670886 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-mbkw8" podUID="b41e588d-948f-4709-8717-cfbe8fbba4c9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.82:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 05 09:42:20 crc kubenswrapper[4795]: I1205 09:42:20.750477 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:42:20 crc kubenswrapper[4795]: E1205 09:42:20.751013 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:42:22 crc kubenswrapper[4795]: I1205 09:42:22.576651 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7xjgf" event={"ID":"9fb61e0e-eb36-461d-bb4f-afc95453f8a5","Type":"ContainerStarted","Data":"1f78f8808b364c1620e3ae72c35428bf42b3fc8826b4cf9f8196a60ba4b556c1"} Dec 05 09:42:23 crc kubenswrapper[4795]: I1205 09:42:23.617805 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7xjgf" podStartSLOduration=4.058287781 podStartE2EDuration="13.617773996s" podCreationTimestamp="2025-12-05 09:42:10 +0000 UTC" firstStartedPulling="2025-12-05 09:42:12.474076789 +0000 UTC 
m=+4684.046680528" lastFinishedPulling="2025-12-05 09:42:22.033563004 +0000 UTC m=+4693.606166743" observedRunningTime="2025-12-05 09:42:23.612517413 +0000 UTC m=+4695.185121142" watchObservedRunningTime="2025-12-05 09:42:23.617773996 +0000 UTC m=+4695.190377735" Dec 05 09:42:30 crc kubenswrapper[4795]: I1205 09:42:30.604152 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:30 crc kubenswrapper[4795]: I1205 09:42:30.604869 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:30 crc kubenswrapper[4795]: I1205 09:42:30.655847 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:31 crc kubenswrapper[4795]: I1205 09:42:31.381963 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:31 crc kubenswrapper[4795]: I1205 09:42:31.467133 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7xjgf"] Dec 05 09:42:33 crc kubenswrapper[4795]: I1205 09:42:33.343733 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7xjgf" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerName="registry-server" containerID="cri-o://1f78f8808b364c1620e3ae72c35428bf42b3fc8826b4cf9f8196a60ba4b556c1" gracePeriod=2 Dec 05 09:42:33 crc kubenswrapper[4795]: I1205 09:42:33.750417 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:42:33 crc kubenswrapper[4795]: E1205 09:42:33.751224 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.424080 4795 generic.go:334] "Generic (PLEG): container finished" podID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerID="1f78f8808b364c1620e3ae72c35428bf42b3fc8826b4cf9f8196a60ba4b556c1" exitCode=0 Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.424189 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7xjgf" event={"ID":"9fb61e0e-eb36-461d-bb4f-afc95453f8a5","Type":"ContainerDied","Data":"1f78f8808b364c1620e3ae72c35428bf42b3fc8826b4cf9f8196a60ba4b556c1"} Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.424230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7xjgf" event={"ID":"9fb61e0e-eb36-461d-bb4f-afc95453f8a5","Type":"ContainerDied","Data":"c42e51a615bc7f55b1f27c0c5cbde3425229d9d759cd162b26cfa9023e8bdaa2"} Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.424268 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c42e51a615bc7f55b1f27c0c5cbde3425229d9d759cd162b26cfa9023e8bdaa2" Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.463480 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.559593 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-utilities\") pod \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.559754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4b4h\" (UniqueName: \"kubernetes.io/projected/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-kube-api-access-r4b4h\") pod \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.559925 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-catalog-content\") pod \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\" (UID: \"9fb61e0e-eb36-461d-bb4f-afc95453f8a5\") " Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.561238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-utilities" (OuterVolumeSpecName: "utilities") pod "9fb61e0e-eb36-461d-bb4f-afc95453f8a5" (UID: "9fb61e0e-eb36-461d-bb4f-afc95453f8a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.567220 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-kube-api-access-r4b4h" (OuterVolumeSpecName: "kube-api-access-r4b4h") pod "9fb61e0e-eb36-461d-bb4f-afc95453f8a5" (UID: "9fb61e0e-eb36-461d-bb4f-afc95453f8a5"). InnerVolumeSpecName "kube-api-access-r4b4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.586706 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fb61e0e-eb36-461d-bb4f-afc95453f8a5" (UID: "9fb61e0e-eb36-461d-bb4f-afc95453f8a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.662943 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.662984 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:42:34 crc kubenswrapper[4795]: I1205 09:42:34.662997 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4b4h\" (UniqueName: \"kubernetes.io/projected/9fb61e0e-eb36-461d-bb4f-afc95453f8a5-kube-api-access-r4b4h\") on node \"crc\" DevicePath \"\"" Dec 05 09:42:35 crc kubenswrapper[4795]: I1205 09:42:35.434746 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7xjgf" Dec 05 09:42:35 crc kubenswrapper[4795]: I1205 09:42:35.464524 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7xjgf"] Dec 05 09:42:35 crc kubenswrapper[4795]: I1205 09:42:35.476445 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7xjgf"] Dec 05 09:42:36 crc kubenswrapper[4795]: I1205 09:42:36.759295 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" path="/var/lib/kubelet/pods/9fb61e0e-eb36-461d-bb4f-afc95453f8a5/volumes" Dec 05 09:42:44 crc kubenswrapper[4795]: I1205 09:42:44.748247 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:42:44 crc kubenswrapper[4795]: E1205 09:42:44.749033 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:42:57 crc kubenswrapper[4795]: I1205 09:42:57.747814 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:42:57 crc kubenswrapper[4795]: E1205 09:42:57.748700 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:43:08 crc kubenswrapper[4795]: I1205 09:43:08.756492 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:43:08 crc kubenswrapper[4795]: E1205 09:43:08.757944 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:43:21 crc kubenswrapper[4795]: I1205 09:43:21.747549 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:43:21 crc kubenswrapper[4795]: E1205 09:43:21.748527 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:43:36 crc kubenswrapper[4795]: I1205 09:43:36.747444 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:43:36 crc kubenswrapper[4795]: E1205 09:43:36.749708 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:43:47 crc kubenswrapper[4795]: I1205 09:43:47.747777 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:43:47 crc kubenswrapper[4795]: E1205 09:43:47.748681 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:43:59 crc kubenswrapper[4795]: I1205 09:43:59.748266 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:43:59 crc kubenswrapper[4795]: E1205 09:43:59.749143 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:44:12 crc kubenswrapper[4795]: I1205 09:44:12.748470 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:44:12 crc kubenswrapper[4795]: E1205 09:44:12.749972 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:44:27 crc kubenswrapper[4795]: I1205 09:44:27.747925 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:44:27 crc kubenswrapper[4795]: E1205 09:44:27.748788 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:44:42 crc kubenswrapper[4795]: I1205 09:44:42.747343 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:44:42 crc kubenswrapper[4795]: E1205 09:44:42.748441 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:44:56 crc kubenswrapper[4795]: I1205 09:44:56.748114 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:44:56 crc kubenswrapper[4795]: E1205 09:44:56.749079 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.158530 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4"] Dec 05 09:45:00 crc kubenswrapper[4795]: E1205 09:45:00.160115 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerName="extract-utilities" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.160136 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerName="extract-utilities" Dec 05 09:45:00 crc kubenswrapper[4795]: E1205 09:45:00.160186 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerName="registry-server" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.160195 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerName="registry-server" Dec 05 09:45:00 crc kubenswrapper[4795]: E1205 09:45:00.160208 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerName="extract-content" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.160215 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerName="extract-content" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.160485 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb61e0e-eb36-461d-bb4f-afc95453f8a5" containerName="registry-server" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.162031 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.167925 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.177061 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4"] Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.179143 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.332662 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5krnb\" (UniqueName: \"kubernetes.io/projected/0da64661-27f1-40d0-8482-cf02ef95be9a-kube-api-access-5krnb\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.332776 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0da64661-27f1-40d0-8482-cf02ef95be9a-secret-volume\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.332823 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0da64661-27f1-40d0-8482-cf02ef95be9a-config-volume\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.434437 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0da64661-27f1-40d0-8482-cf02ef95be9a-secret-volume\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.434525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0da64661-27f1-40d0-8482-cf02ef95be9a-config-volume\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.434639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5krnb\" (UniqueName: \"kubernetes.io/projected/0da64661-27f1-40d0-8482-cf02ef95be9a-kube-api-access-5krnb\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.435828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0da64661-27f1-40d0-8482-cf02ef95be9a-config-volume\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.456458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0da64661-27f1-40d0-8482-cf02ef95be9a-secret-volume\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.457128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5krnb\" (UniqueName: \"kubernetes.io/projected/0da64661-27f1-40d0-8482-cf02ef95be9a-kube-api-access-5krnb\") pod \"collect-profiles-29415465-8qrk4\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:00 crc kubenswrapper[4795]: I1205 09:45:00.495806 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:01 crc kubenswrapper[4795]: I1205 09:45:01.036880 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4"] Dec 05 09:45:01 crc kubenswrapper[4795]: I1205 09:45:01.973570 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" event={"ID":"0da64661-27f1-40d0-8482-cf02ef95be9a","Type":"ContainerStarted","Data":"c4a3e6a38c481094bc29361595264167ffe36a6401359728cc3adb358cb73533"} Dec 05 09:45:02 crc kubenswrapper[4795]: I1205 09:45:02.986219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" event={"ID":"0da64661-27f1-40d0-8482-cf02ef95be9a","Type":"ContainerStarted","Data":"d69d61d3fc91cd4285c7ee0d30a457c80ab547abc8c98d8bab23b8e840ade092"} Dec 05 09:45:03 crc kubenswrapper[4795]: I1205 09:45:03.998507 4795 generic.go:334] "Generic (PLEG): container finished" podID="0da64661-27f1-40d0-8482-cf02ef95be9a" 
containerID="d69d61d3fc91cd4285c7ee0d30a457c80ab547abc8c98d8bab23b8e840ade092" exitCode=0 Dec 05 09:45:03 crc kubenswrapper[4795]: I1205 09:45:03.998601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" event={"ID":"0da64661-27f1-40d0-8482-cf02ef95be9a","Type":"ContainerDied","Data":"d69d61d3fc91cd4285c7ee0d30a457c80ab547abc8c98d8bab23b8e840ade092"} Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.481708 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.651251 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5krnb\" (UniqueName: \"kubernetes.io/projected/0da64661-27f1-40d0-8482-cf02ef95be9a-kube-api-access-5krnb\") pod \"0da64661-27f1-40d0-8482-cf02ef95be9a\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.651611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0da64661-27f1-40d0-8482-cf02ef95be9a-config-volume\") pod \"0da64661-27f1-40d0-8482-cf02ef95be9a\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.651713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0da64661-27f1-40d0-8482-cf02ef95be9a-secret-volume\") pod \"0da64661-27f1-40d0-8482-cf02ef95be9a\" (UID: \"0da64661-27f1-40d0-8482-cf02ef95be9a\") " Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.654161 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da64661-27f1-40d0-8482-cf02ef95be9a-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"0da64661-27f1-40d0-8482-cf02ef95be9a" (UID: "0da64661-27f1-40d0-8482-cf02ef95be9a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.664884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da64661-27f1-40d0-8482-cf02ef95be9a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0da64661-27f1-40d0-8482-cf02ef95be9a" (UID: "0da64661-27f1-40d0-8482-cf02ef95be9a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.668185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da64661-27f1-40d0-8482-cf02ef95be9a-kube-api-access-5krnb" (OuterVolumeSpecName: "kube-api-access-5krnb") pod "0da64661-27f1-40d0-8482-cf02ef95be9a" (UID: "0da64661-27f1-40d0-8482-cf02ef95be9a"). InnerVolumeSpecName "kube-api-access-5krnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.754505 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5krnb\" (UniqueName: \"kubernetes.io/projected/0da64661-27f1-40d0-8482-cf02ef95be9a-kube-api-access-5krnb\") on node \"crc\" DevicePath \"\"" Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.754556 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0da64661-27f1-40d0-8482-cf02ef95be9a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 09:45:05 crc kubenswrapper[4795]: I1205 09:45:05.754569 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0da64661-27f1-40d0-8482-cf02ef95be9a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 09:45:06 crc kubenswrapper[4795]: I1205 09:45:06.020810 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" event={"ID":"0da64661-27f1-40d0-8482-cf02ef95be9a","Type":"ContainerDied","Data":"c4a3e6a38c481094bc29361595264167ffe36a6401359728cc3adb358cb73533"} Dec 05 09:45:06 crc kubenswrapper[4795]: I1205 09:45:06.021096 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415465-8qrk4" Dec 05 09:45:06 crc kubenswrapper[4795]: I1205 09:45:06.021225 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4a3e6a38c481094bc29361595264167ffe36a6401359728cc3adb358cb73533" Dec 05 09:45:06 crc kubenswrapper[4795]: I1205 09:45:06.619892 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5"] Dec 05 09:45:06 crc kubenswrapper[4795]: I1205 09:45:06.630905 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415420-h87z5"] Dec 05 09:45:06 crc kubenswrapper[4795]: I1205 09:45:06.764114 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b81a91e2-dcb4-4743-8fa5-836b060e27f1" path="/var/lib/kubelet/pods/b81a91e2-dcb4-4743-8fa5-836b060e27f1/volumes" Dec 05 09:45:08 crc kubenswrapper[4795]: I1205 09:45:08.756788 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:45:08 crc kubenswrapper[4795]: E1205 09:45:08.757296 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:45:23 crc kubenswrapper[4795]: I1205 09:45:23.748277 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:45:25 crc kubenswrapper[4795]: I1205 09:45:25.223018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"0463266c4052607fd5db6f7384bcd4a6cdf836c5b67ce12ebe3126e72b49b3b3"} Dec 05 09:45:34 crc kubenswrapper[4795]: I1205 09:45:34.691335 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="09a55d95-050f-4262-9bb4-7dc81ae6ea34" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:45:34 crc kubenswrapper[4795]: I1205 09:45:34.691389 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="09a55d95-050f-4262-9bb4-7dc81ae6ea34" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:45:54 crc kubenswrapper[4795]: I1205 09:45:54.670142 4795 scope.go:117] "RemoveContainer" containerID="3042ebd2620f8e1e4debe629d6bd4b7aacbfd04de32e88efa080ad2eb811ccc8" Dec 05 09:47:24 crc kubenswrapper[4795]: I1205 09:47:24.068980 4795 patch_prober.go:28] interesting pod/console-57cf76c794-fv9fk container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/health\": context deadline exceeded" start-of-body= Dec 05 09:47:24 crc kubenswrapper[4795]: I1205 09:47:24.069572 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-57cf76c794-fv9fk" podUID="d5b15de2-618a-4aad-886e-7ba7ba43307e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.35:8443/health\": context deadline exceeded" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.352101 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hs9g"] Dec 05 09:47:40 crc kubenswrapper[4795]: E1205 09:47:40.353231 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da64661-27f1-40d0-8482-cf02ef95be9a" containerName="collect-profiles" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.353248 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0da64661-27f1-40d0-8482-cf02ef95be9a" containerName="collect-profiles" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.353494 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da64661-27f1-40d0-8482-cf02ef95be9a" containerName="collect-profiles" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.355151 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.372798 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hs9g"] Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.522110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-catalog-content\") pod \"certified-operators-4hs9g\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.522568 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5wb7\" (UniqueName: \"kubernetes.io/projected/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-kube-api-access-l5wb7\") pod \"certified-operators-4hs9g\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.522747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-utilities\") pod \"certified-operators-4hs9g\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.541093 4795 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q6ctw"] Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.543784 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.553276 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6ctw"] Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.625067 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-utilities\") pod \"certified-operators-4hs9g\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.625159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-catalog-content\") pod \"certified-operators-4hs9g\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.625261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-catalog-content\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.625334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5wb7\" (UniqueName: \"kubernetes.io/projected/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-kube-api-access-l5wb7\") pod \"certified-operators-4hs9g\" (UID: 
\"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.625369 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-utilities\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.625387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndmgg\" (UniqueName: \"kubernetes.io/projected/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-kube-api-access-ndmgg\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.625844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-catalog-content\") pod \"certified-operators-4hs9g\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.625865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-utilities\") pod \"certified-operators-4hs9g\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.646313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5wb7\" (UniqueName: \"kubernetes.io/projected/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-kube-api-access-l5wb7\") pod \"certified-operators-4hs9g\" (UID: 
\"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.690939 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.727585 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-catalog-content\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.727916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-utilities\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.727992 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndmgg\" (UniqueName: \"kubernetes.io/projected/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-kube-api-access-ndmgg\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.728737 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-catalog-content\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.728858 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-utilities\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.752574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndmgg\" (UniqueName: \"kubernetes.io/projected/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-kube-api-access-ndmgg\") pod \"redhat-operators-q6ctw\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.831606 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.831929 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:47:40 crc kubenswrapper[4795]: I1205 09:47:40.878983 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:47:41 crc kubenswrapper[4795]: I1205 09:47:41.326243 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hs9g"] Dec 05 09:47:41 crc kubenswrapper[4795]: I1205 09:47:41.569525 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6ctw"] Dec 05 09:47:41 crc kubenswrapper[4795]: W1205 09:47:41.573659 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec5a755d_d7a9_472e_b391_6d38a0a1b2bc.slice/crio-406438abeda17b87020c089523b34440956d74d00e8cd1c9bbb9eaee6676a0be WatchSource:0}: Error finding container 406438abeda17b87020c089523b34440956d74d00e8cd1c9bbb9eaee6676a0be: Status 404 returned error can't find the container with id 406438abeda17b87020c089523b34440956d74d00e8cd1c9bbb9eaee6676a0be Dec 05 09:47:42 crc kubenswrapper[4795]: I1205 09:47:42.082073 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6ctw" event={"ID":"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc","Type":"ContainerStarted","Data":"87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381"} Dec 05 09:47:42 crc kubenswrapper[4795]: I1205 09:47:42.082573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6ctw" event={"ID":"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc","Type":"ContainerStarted","Data":"406438abeda17b87020c089523b34440956d74d00e8cd1c9bbb9eaee6676a0be"} Dec 05 09:47:42 crc kubenswrapper[4795]: I1205 09:47:42.088904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hs9g" event={"ID":"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3","Type":"ContainerStarted","Data":"1ea90c7c7396d8dd1d380f38043df8585398c6056fa40f15357b5c951e8a52d4"} Dec 05 09:47:42 crc kubenswrapper[4795]: I1205 09:47:42.088969 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hs9g" event={"ID":"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3","Type":"ContainerStarted","Data":"72f5b80b0ab85e328f68e994b637a7c60fc0cf83859bfea65dc997ed27190179"} Dec 05 09:47:42 crc kubenswrapper[4795]: I1205 09:47:42.093753 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:47:43 crc kubenswrapper[4795]: I1205 09:47:43.099797 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerID="87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381" exitCode=0 Dec 05 09:47:43 crc kubenswrapper[4795]: I1205 09:47:43.099908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6ctw" event={"ID":"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc","Type":"ContainerDied","Data":"87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381"} Dec 05 09:47:43 crc kubenswrapper[4795]: I1205 09:47:43.103557 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerID="1ea90c7c7396d8dd1d380f38043df8585398c6056fa40f15357b5c951e8a52d4" exitCode=0 Dec 05 09:47:43 crc kubenswrapper[4795]: I1205 09:47:43.103744 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hs9g" event={"ID":"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3","Type":"ContainerDied","Data":"1ea90c7c7396d8dd1d380f38043df8585398c6056fa40f15357b5c951e8a52d4"} Dec 05 09:47:45 crc kubenswrapper[4795]: I1205 09:47:45.137066 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hs9g" event={"ID":"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3","Type":"ContainerStarted","Data":"7806c51f83d904921c3486df682bb1528a2ea22a4dbad885269072af97715ee6"} Dec 05 09:47:46 crc kubenswrapper[4795]: I1205 09:47:46.149344 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-q6ctw" event={"ID":"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc","Type":"ContainerStarted","Data":"c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69"} Dec 05 09:47:46 crc kubenswrapper[4795]: I1205 09:47:46.153276 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerID="7806c51f83d904921c3486df682bb1528a2ea22a4dbad885269072af97715ee6" exitCode=0 Dec 05 09:47:46 crc kubenswrapper[4795]: I1205 09:47:46.153343 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hs9g" event={"ID":"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3","Type":"ContainerDied","Data":"7806c51f83d904921c3486df682bb1528a2ea22a4dbad885269072af97715ee6"} Dec 05 09:47:59 crc kubenswrapper[4795]: I1205 09:47:59.293593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hs9g" event={"ID":"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3","Type":"ContainerStarted","Data":"5637979f66cdd2b7f45d517cce0234b0828e2fba7a820f231a0bfadaabeb48d0"} Dec 05 09:47:59 crc kubenswrapper[4795]: I1205 09:47:59.321662 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hs9g" podStartSLOduration=2.924659456 podStartE2EDuration="19.321641639s" podCreationTimestamp="2025-12-05 09:47:40 +0000 UTC" firstStartedPulling="2025-12-05 09:47:42.09131832 +0000 UTC m=+5013.663922059" lastFinishedPulling="2025-12-05 09:47:58.488300503 +0000 UTC m=+5030.060904242" observedRunningTime="2025-12-05 09:47:59.314817234 +0000 UTC m=+5030.887420973" watchObservedRunningTime="2025-12-05 09:47:59.321641639 +0000 UTC m=+5030.894245388" Dec 05 09:48:00 crc kubenswrapper[4795]: I1205 09:48:00.304917 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerID="c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69" 
exitCode=0 Dec 05 09:48:00 crc kubenswrapper[4795]: I1205 09:48:00.306411 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6ctw" event={"ID":"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc","Type":"ContainerDied","Data":"c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69"} Dec 05 09:48:00 crc kubenswrapper[4795]: I1205 09:48:00.691453 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:48:00 crc kubenswrapper[4795]: I1205 09:48:00.691844 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:48:01 crc kubenswrapper[4795]: I1205 09:48:01.318659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6ctw" event={"ID":"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc","Type":"ContainerStarted","Data":"a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7"} Dec 05 09:48:01 crc kubenswrapper[4795]: I1205 09:48:01.748280 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-4hs9g" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="registry-server" probeResult="failure" output=< Dec 05 09:48:01 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:48:01 crc kubenswrapper[4795]: > Dec 05 09:48:10 crc kubenswrapper[4795]: I1205 09:48:10.759183 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:48:10 crc kubenswrapper[4795]: I1205 09:48:10.786204 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q6ctw" podStartSLOduration=13.683252009 podStartE2EDuration="30.786181278s" podCreationTimestamp="2025-12-05 09:47:40 +0000 UTC" firstStartedPulling="2025-12-05 09:47:43.691881367 
+0000 UTC m=+5015.264485106" lastFinishedPulling="2025-12-05 09:48:00.794810626 +0000 UTC m=+5032.367414375" observedRunningTime="2025-12-05 09:48:01.343373327 +0000 UTC m=+5032.915977066" watchObservedRunningTime="2025-12-05 09:48:10.786181278 +0000 UTC m=+5042.358785027" Dec 05 09:48:10 crc kubenswrapper[4795]: I1205 09:48:10.814006 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:48:10 crc kubenswrapper[4795]: I1205 09:48:10.826816 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:48:10 crc kubenswrapper[4795]: I1205 09:48:10.826885 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:48:10 crc kubenswrapper[4795]: I1205 09:48:10.881354 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:48:10 crc kubenswrapper[4795]: I1205 09:48:10.881528 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:48:10 crc kubenswrapper[4795]: I1205 09:48:10.936895 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:48:11 crc kubenswrapper[4795]: I1205 09:48:11.464910 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:48:11 crc 
kubenswrapper[4795]: I1205 09:48:11.942761 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hs9g"] Dec 05 09:48:12 crc kubenswrapper[4795]: I1205 09:48:12.423795 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4hs9g" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="registry-server" containerID="cri-o://5637979f66cdd2b7f45d517cce0234b0828e2fba7a820f231a0bfadaabeb48d0" gracePeriod=2 Dec 05 09:48:13 crc kubenswrapper[4795]: I1205 09:48:13.343968 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q6ctw"] Dec 05 09:48:13 crc kubenswrapper[4795]: I1205 09:48:13.437409 4795 generic.go:334] "Generic (PLEG): container finished" podID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerID="5637979f66cdd2b7f45d517cce0234b0828e2fba7a820f231a0bfadaabeb48d0" exitCode=0 Dec 05 09:48:13 crc kubenswrapper[4795]: I1205 09:48:13.437506 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hs9g" event={"ID":"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3","Type":"ContainerDied","Data":"5637979f66cdd2b7f45d517cce0234b0828e2fba7a820f231a0bfadaabeb48d0"} Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.042659 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.102842 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-utilities\") pod \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.102901 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-catalog-content\") pod \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.103083 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5wb7\" (UniqueName: \"kubernetes.io/projected/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-kube-api-access-l5wb7\") pod \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\" (UID: \"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3\") " Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.103515 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-utilities" (OuterVolumeSpecName: "utilities") pod "8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" (UID: "8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.103662 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.110275 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-kube-api-access-l5wb7" (OuterVolumeSpecName: "kube-api-access-l5wb7") pod "8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" (UID: "8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3"). InnerVolumeSpecName "kube-api-access-l5wb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.165596 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" (UID: "8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.205869 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5wb7\" (UniqueName: \"kubernetes.io/projected/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-kube-api-access-l5wb7\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.205913 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.450753 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4hs9g" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.450724 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hs9g" event={"ID":"8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3","Type":"ContainerDied","Data":"72f5b80b0ab85e328f68e994b637a7c60fc0cf83859bfea65dc997ed27190179"} Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.450842 4795 scope.go:117] "RemoveContainer" containerID="5637979f66cdd2b7f45d517cce0234b0828e2fba7a820f231a0bfadaabeb48d0" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.450917 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q6ctw" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerName="registry-server" containerID="cri-o://a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7" gracePeriod=2 Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.481600 4795 scope.go:117] "RemoveContainer" containerID="7806c51f83d904921c3486df682bb1528a2ea22a4dbad885269072af97715ee6" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.511051 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hs9g"] Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.524284 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4hs9g"] Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.533467 4795 scope.go:117] "RemoveContainer" containerID="1ea90c7c7396d8dd1d380f38043df8585398c6056fa40f15357b5c951e8a52d4" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.765962 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" path="/var/lib/kubelet/pods/8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3/volumes" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.855677 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.923089 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndmgg\" (UniqueName: \"kubernetes.io/projected/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-kube-api-access-ndmgg\") pod \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.923403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-utilities\") pod \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.923779 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-catalog-content\") pod \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\" (UID: \"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc\") " Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.924345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-utilities" (OuterVolumeSpecName: "utilities") pod "ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" (UID: "ec5a755d-d7a9-472e-b391-6d38a0a1b2bc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.924754 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:14 crc kubenswrapper[4795]: I1205 09:48:14.933308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-kube-api-access-ndmgg" (OuterVolumeSpecName: "kube-api-access-ndmgg") pod "ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" (UID: "ec5a755d-d7a9-472e-b391-6d38a0a1b2bc"). InnerVolumeSpecName "kube-api-access-ndmgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.027241 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndmgg\" (UniqueName: \"kubernetes.io/projected/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-kube-api-access-ndmgg\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.073056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" (UID: "ec5a755d-d7a9-472e-b391-6d38a0a1b2bc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.129587 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.465415 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerID="a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7" exitCode=0 Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.465639 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6ctw" event={"ID":"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc","Type":"ContainerDied","Data":"a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7"} Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.465698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6ctw" event={"ID":"ec5a755d-d7a9-472e-b391-6d38a0a1b2bc","Type":"ContainerDied","Data":"406438abeda17b87020c089523b34440956d74d00e8cd1c9bbb9eaee6676a0be"} Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.465729 4795 scope.go:117] "RemoveContainer" containerID="a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.465909 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q6ctw" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.501447 4795 scope.go:117] "RemoveContainer" containerID="c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.521488 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q6ctw"] Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.535393 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q6ctw"] Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.580808 4795 scope.go:117] "RemoveContainer" containerID="87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.608670 4795 scope.go:117] "RemoveContainer" containerID="a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7" Dec 05 09:48:15 crc kubenswrapper[4795]: E1205 09:48:15.609373 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7\": container with ID starting with a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7 not found: ID does not exist" containerID="a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.609429 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7"} err="failed to get container status \"a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7\": rpc error: code = NotFound desc = could not find container \"a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7\": container with ID starting with a94e9066bca4b997db9ab9f673e0c85bb752c8b0eafb2c8c26b95684d848bca7 not found: ID does 
not exist" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.609475 4795 scope.go:117] "RemoveContainer" containerID="c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69" Dec 05 09:48:15 crc kubenswrapper[4795]: E1205 09:48:15.610375 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69\": container with ID starting with c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69 not found: ID does not exist" containerID="c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.610460 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69"} err="failed to get container status \"c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69\": rpc error: code = NotFound desc = could not find container \"c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69\": container with ID starting with c2967400f27a46ffe15cc631e0e744e3ea4fb76026bf5c489fabad2b2e13ee69 not found: ID does not exist" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.610501 4795 scope.go:117] "RemoveContainer" containerID="87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381" Dec 05 09:48:15 crc kubenswrapper[4795]: E1205 09:48:15.611013 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381\": container with ID starting with 87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381 not found: ID does not exist" containerID="87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381" Dec 05 09:48:15 crc kubenswrapper[4795]: I1205 09:48:15.611087 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381"} err="failed to get container status \"87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381\": rpc error: code = NotFound desc = could not find container \"87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381\": container with ID starting with 87bd3f97f58892c8cf8a32b41bcdda72f04477953805b972033662310d50a381 not found: ID does not exist" Dec 05 09:48:16 crc kubenswrapper[4795]: I1205 09:48:16.761695 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" path="/var/lib/kubelet/pods/ec5a755d-d7a9-472e-b391-6d38a0a1b2bc/volumes" Dec 05 09:48:19 crc kubenswrapper[4795]: I1205 09:48:19.523047 4795 generic.go:334] "Generic (PLEG): container finished" podID="7f223c39-817f-4bac-9c3b-490359a0e44d" containerID="6853b0e9af0634724323ea374ccf54d5da7cfca3f842ddce91b07a12efa15064" exitCode=1 Dec 05 09:48:19 crc kubenswrapper[4795]: I1205 09:48:19.523142 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"7f223c39-817f-4bac-9c3b-490359a0e44d","Type":"ContainerDied","Data":"6853b0e9af0634724323ea374ccf54d5da7cfca3f842ddce91b07a12efa15064"} Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.325916 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.479370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dh4w\" (UniqueName: \"kubernetes.io/projected/7f223c39-817f-4bac-9c3b-490359a0e44d-kube-api-access-5dh4w\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.479425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config-secret\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.479541 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-workdir\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.479570 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.479958 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.480022 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ssh-key\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.480161 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-config-data\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.480236 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ca-certs\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.480257 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-temporary\") pod \"7f223c39-817f-4bac-9c3b-490359a0e44d\" (UID: \"7f223c39-817f-4bac-9c3b-490359a0e44d\") " Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.486013 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.493133 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-config-data" (OuterVolumeSpecName: "config-data") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.493486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f223c39-817f-4bac-9c3b-490359a0e44d-kube-api-access-5dh4w" (OuterVolumeSpecName: "kube-api-access-5dh4w") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "kube-api-access-5dh4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.497002 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.502468 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.545574 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"7f223c39-817f-4bac-9c3b-490359a0e44d","Type":"ContainerDied","Data":"29047ab2fc5889f5dd0ced6ddb0192201e9f995cdd002ea836bf75cd62b82a38"} Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.545653 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29047ab2fc5889f5dd0ced6ddb0192201e9f995cdd002ea836bf75cd62b82a38" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.545659 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.584529 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-config-data\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.584575 4795 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.584589 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dh4w\" (UniqueName: \"kubernetes.io/projected/7f223c39-817f-4bac-9c3b-490359a0e44d-kube-api-access-5dh4w\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:21 crc kubenswrapper[4795]: I1205 09:48:21.584599 4795 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/7f223c39-817f-4bac-9c3b-490359a0e44d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.459097 4795 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.460891 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.471072 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.474517 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.476547 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7f223c39-817f-4bac-9c3b-490359a0e44d" (UID: "7f223c39-817f-4bac-9c3b-490359a0e44d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.508140 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.561190 4795 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.561239 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.561254 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f223c39-817f-4bac-9c3b-490359a0e44d-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.561267 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7f223c39-817f-4bac-9c3b-490359a0e44d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:26 crc kubenswrapper[4795]: I1205 09:48:26.762088 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.187638 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 09:48:31 crc kubenswrapper[4795]: E1205 09:48:31.189151 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerName="extract-utilities" Dec 05 09:48:31 crc 
kubenswrapper[4795]: I1205 09:48:31.189171 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerName="extract-utilities" Dec 05 09:48:31 crc kubenswrapper[4795]: E1205 09:48:31.189191 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="registry-server" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189197 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="registry-server" Dec 05 09:48:31 crc kubenswrapper[4795]: E1205 09:48:31.189206 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerName="registry-server" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189212 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerName="registry-server" Dec 05 09:48:31 crc kubenswrapper[4795]: E1205 09:48:31.189227 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f223c39-817f-4bac-9c3b-490359a0e44d" containerName="tempest-tests-tempest-tests-runner" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189235 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f223c39-817f-4bac-9c3b-490359a0e44d" containerName="tempest-tests-tempest-tests-runner" Dec 05 09:48:31 crc kubenswrapper[4795]: E1205 09:48:31.189249 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="extract-content" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189257 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="extract-content" Dec 05 09:48:31 crc kubenswrapper[4795]: E1205 09:48:31.189271 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" 
containerName="extract-content" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189278 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerName="extract-content" Dec 05 09:48:31 crc kubenswrapper[4795]: E1205 09:48:31.189285 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="extract-utilities" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189292 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="extract-utilities" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189488 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e03a5ae-de55-47ad-95b7-8ccf3aaa55a3" containerName="registry-server" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189502 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5a755d-d7a9-472e-b391-6d38a0a1b2bc" containerName="registry-server" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.189512 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f223c39-817f-4bac-9c3b-490359a0e44d" containerName="tempest-tests-tempest-tests-runner" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.190279 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.196150 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-48rcz" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.202964 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.357468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjbm5\" (UniqueName: \"kubernetes.io/projected/e7982c9e-f3b2-416c-bafa-c089c65e882c-kube-api-access-mjbm5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e7982c9e-f3b2-416c-bafa-c089c65e882c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.357565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e7982c9e-f3b2-416c-bafa-c089c65e882c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.459433 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjbm5\" (UniqueName: \"kubernetes.io/projected/e7982c9e-f3b2-416c-bafa-c089c65e882c-kube-api-access-mjbm5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e7982c9e-f3b2-416c-bafa-c089c65e882c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.459546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e7982c9e-f3b2-416c-bafa-c089c65e882c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.460809 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e7982c9e-f3b2-416c-bafa-c089c65e882c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.495476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjbm5\" (UniqueName: \"kubernetes.io/projected/e7982c9e-f3b2-416c-bafa-c089c65e882c-kube-api-access-mjbm5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e7982c9e-f3b2-416c-bafa-c089c65e882c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.501814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e7982c9e-f3b2-416c-bafa-c089c65e882c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:31 crc kubenswrapper[4795]: I1205 09:48:31.526142 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 05 09:48:32 crc kubenswrapper[4795]: I1205 09:48:32.123170 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 05 09:48:32 crc kubenswrapper[4795]: I1205 09:48:32.705657 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e7982c9e-f3b2-416c-bafa-c089c65e882c","Type":"ContainerStarted","Data":"cdb29b541b179229e43e030be0a1befaa49d8353201d142ae7e330332c5e3046"} Dec 05 09:48:34 crc kubenswrapper[4795]: I1205 09:48:34.729903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e7982c9e-f3b2-416c-bafa-c089c65e882c","Type":"ContainerStarted","Data":"18b777f586e089b9e57410dfe0956a61c03d8eb4f38add0933457d9da1b1f033"} Dec 05 09:48:34 crc kubenswrapper[4795]: I1205 09:48:34.747058 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.326498168 podStartE2EDuration="3.747034624s" podCreationTimestamp="2025-12-05 09:48:31 +0000 UTC" firstStartedPulling="2025-12-05 09:48:32.135239629 +0000 UTC m=+5063.707843368" lastFinishedPulling="2025-12-05 09:48:33.555776085 +0000 UTC m=+5065.128379824" observedRunningTime="2025-12-05 09:48:34.746557901 +0000 UTC m=+5066.319161650" watchObservedRunningTime="2025-12-05 09:48:34.747034624 +0000 UTC m=+5066.319638363" Dec 05 09:48:40 crc kubenswrapper[4795]: I1205 09:48:40.826920 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:48:40 crc 
kubenswrapper[4795]: I1205 09:48:40.827484 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:48:40 crc kubenswrapper[4795]: I1205 09:48:40.827545 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:48:40 crc kubenswrapper[4795]: I1205 09:48:40.828480 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0463266c4052607fd5db6f7384bcd4a6cdf836c5b67ce12ebe3126e72b49b3b3"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:48:40 crc kubenswrapper[4795]: I1205 09:48:40.828534 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://0463266c4052607fd5db6f7384bcd4a6cdf836c5b67ce12ebe3126e72b49b3b3" gracePeriod=600 Dec 05 09:48:41 crc kubenswrapper[4795]: I1205 09:48:41.800826 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="0463266c4052607fd5db6f7384bcd4a6cdf836c5b67ce12ebe3126e72b49b3b3" exitCode=0 Dec 05 09:48:41 crc kubenswrapper[4795]: I1205 09:48:41.801263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"0463266c4052607fd5db6f7384bcd4a6cdf836c5b67ce12ebe3126e72b49b3b3"} 
Dec 05 09:48:41 crc kubenswrapper[4795]: I1205 09:48:41.801313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6"} Dec 05 09:48:41 crc kubenswrapper[4795]: I1205 09:48:41.801330 4795 scope.go:117] "RemoveContainer" containerID="de00e518750c6134e27c50b6f7accb77d0935903ce82d0e8e373d8cc0716dc2a" Dec 05 09:48:54 crc kubenswrapper[4795]: I1205 09:48:54.878306 4795 scope.go:117] "RemoveContainer" containerID="1f78f8808b364c1620e3ae72c35428bf42b3fc8826b4cf9f8196a60ba4b556c1" Dec 05 09:48:54 crc kubenswrapper[4795]: I1205 09:48:54.915756 4795 scope.go:117] "RemoveContainer" containerID="bc52d8bc9698e7089ec725b556c971b83f27ad964e11a1fdbb2a5a52495d1d0b" Dec 05 09:48:54 crc kubenswrapper[4795]: I1205 09:48:54.963875 4795 scope.go:117] "RemoveContainer" containerID="af76297a7f7ecb38ba43cbc73ab2fa35cc57943d4e5077ca3567e85a33e094a3" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.252536 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r7g2q/must-gather-t9zkl"] Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.258645 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.263904 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r7g2q"/"openshift-service-ca.crt" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.264279 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-r7g2q"/"default-dockercfg-wrvlq" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.264465 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-r7g2q"/"kube-root-ca.crt" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.273825 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r7g2q/must-gather-t9zkl"] Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.414984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46klk\" (UniqueName: \"kubernetes.io/projected/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-kube-api-access-46klk\") pod \"must-gather-t9zkl\" (UID: \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\") " pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.415068 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-must-gather-output\") pod \"must-gather-t9zkl\" (UID: \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\") " pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.516556 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-must-gather-output\") pod \"must-gather-t9zkl\" (UID: \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\") " 
pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.516772 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46klk\" (UniqueName: \"kubernetes.io/projected/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-kube-api-access-46klk\") pod \"must-gather-t9zkl\" (UID: \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\") " pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.516982 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-must-gather-output\") pod \"must-gather-t9zkl\" (UID: \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\") " pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.553572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46klk\" (UniqueName: \"kubernetes.io/projected/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-kube-api-access-46klk\") pod \"must-gather-t9zkl\" (UID: \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\") " pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:49:12 crc kubenswrapper[4795]: I1205 09:49:12.585392 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:49:13 crc kubenswrapper[4795]: I1205 09:49:13.023798 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r7g2q/must-gather-t9zkl"] Dec 05 09:49:13 crc kubenswrapper[4795]: W1205 09:49:13.024820 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76eb2c1b_bd2e_4d60_b0a3_9d2ff41f9397.slice/crio-2d4a30911334e66ab6c8982bab3edbc08a7544b030e4159d14b32032e74b8c02 WatchSource:0}: Error finding container 2d4a30911334e66ab6c8982bab3edbc08a7544b030e4159d14b32032e74b8c02: Status 404 returned error can't find the container with id 2d4a30911334e66ab6c8982bab3edbc08a7544b030e4159d14b32032e74b8c02 Dec 05 09:49:13 crc kubenswrapper[4795]: I1205 09:49:13.126186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" event={"ID":"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397","Type":"ContainerStarted","Data":"2d4a30911334e66ab6c8982bab3edbc08a7544b030e4159d14b32032e74b8c02"} Dec 05 09:49:20 crc kubenswrapper[4795]: I1205 09:49:20.215434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" event={"ID":"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397","Type":"ContainerStarted","Data":"383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65"} Dec 05 09:49:20 crc kubenswrapper[4795]: I1205 09:49:20.216328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" event={"ID":"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397","Type":"ContainerStarted","Data":"b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29"} Dec 05 09:49:20 crc kubenswrapper[4795]: I1205 09:49:20.238254 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" podStartSLOduration=2.040150288 
podStartE2EDuration="8.23822857s" podCreationTimestamp="2025-12-05 09:49:12 +0000 UTC" firstStartedPulling="2025-12-05 09:49:13.031746276 +0000 UTC m=+5104.604350015" lastFinishedPulling="2025-12-05 09:49:19.229824558 +0000 UTC m=+5110.802428297" observedRunningTime="2025-12-05 09:49:20.232350591 +0000 UTC m=+5111.804954330" watchObservedRunningTime="2025-12-05 09:49:20.23822857 +0000 UTC m=+5111.810832309" Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.061360 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-sm9nn"] Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.063743 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.233990 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-host\") pod \"crc-debug-sm9nn\" (UID: \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\") " pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.234250 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qndgk\" (UniqueName: \"kubernetes.io/projected/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-kube-api-access-qndgk\") pod \"crc-debug-sm9nn\" (UID: \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\") " pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.336192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-host\") pod \"crc-debug-sm9nn\" (UID: \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\") " pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.336290 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qndgk\" (UniqueName: \"kubernetes.io/projected/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-kube-api-access-qndgk\") pod \"crc-debug-sm9nn\" (UID: \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\") " pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.336871 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-host\") pod \"crc-debug-sm9nn\" (UID: \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\") " pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.358774 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qndgk\" (UniqueName: \"kubernetes.io/projected/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-kube-api-access-qndgk\") pod \"crc-debug-sm9nn\" (UID: \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\") " pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:49:27 crc kubenswrapper[4795]: I1205 09:49:27.389335 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:49:28 crc kubenswrapper[4795]: I1205 09:49:28.309432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" event={"ID":"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95","Type":"ContainerStarted","Data":"0206bef10e010caa9fd9d200ee33ff7ae03061ac62be9030842c9a27a997da7a"} Dec 05 09:49:43 crc kubenswrapper[4795]: E1205 09:49:43.557236 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Dec 05 09:49:43 crc kubenswrapper[4795]: E1205 09:49:43.558439 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qndgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-sm9nn_openshift-must-gather-r7g2q(40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 05 09:49:43 crc kubenswrapper[4795]: E1205 09:49:43.559564 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" podUID="40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" Dec 05 09:49:44 crc kubenswrapper[4795]: E1205 09:49:44.550942 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" podUID="40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" Dec 05 09:49:57 crc kubenswrapper[4795]: I1205 09:49:57.677840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" event={"ID":"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95","Type":"ContainerStarted","Data":"630afc3a70b153edaa827cb656a9c5461ffb42451018c25397908b0023f00f15"} Dec 05 09:49:57 crc kubenswrapper[4795]: I1205 09:49:57.705908 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" podStartSLOduration=0.919906753 podStartE2EDuration="30.705883638s" podCreationTimestamp="2025-12-05 09:49:27 +0000 UTC" firstStartedPulling="2025-12-05 09:49:27.457177134 +0000 UTC m=+5119.029780873" lastFinishedPulling="2025-12-05 09:49:57.243154019 +0000 UTC m=+5148.815757758" observedRunningTime="2025-12-05 09:49:57.693385409 +0000 UTC m=+5149.265989148" watchObservedRunningTime="2025-12-05 09:49:57.705883638 +0000 UTC m=+5149.278487377" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.231365 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xp4zj"] Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.240487 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.261296 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xp4zj"] Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.315886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-utilities\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.316154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86kd2\" (UniqueName: \"kubernetes.io/projected/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-kube-api-access-86kd2\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.316286 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-catalog-content\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.422242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86kd2\" (UniqueName: \"kubernetes.io/projected/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-kube-api-access-86kd2\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.422329 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-catalog-content\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.422387 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-utilities\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.422946 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-utilities\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.423426 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-catalog-content\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.813574 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86kd2\" (UniqueName: \"kubernetes.io/projected/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-kube-api-access-86kd2\") pod \"community-operators-xp4zj\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:00 crc kubenswrapper[4795]: I1205 09:51:00.869496 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:01 crc kubenswrapper[4795]: I1205 09:51:01.562893 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xp4zj"] Dec 05 09:51:02 crc kubenswrapper[4795]: I1205 09:51:02.425473 4795 generic.go:334] "Generic (PLEG): container finished" podID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerID="453e793eeeedd1043520b23dd65ace8394116b86fbe75e95e5b3e71908bc38d7" exitCode=0 Dec 05 09:51:02 crc kubenswrapper[4795]: I1205 09:51:02.425585 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4zj" event={"ID":"51ef45ff-0983-4eea-a02d-1d9277e8ff1b","Type":"ContainerDied","Data":"453e793eeeedd1043520b23dd65ace8394116b86fbe75e95e5b3e71908bc38d7"} Dec 05 09:51:02 crc kubenswrapper[4795]: I1205 09:51:02.425729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4zj" event={"ID":"51ef45ff-0983-4eea-a02d-1d9277e8ff1b","Type":"ContainerStarted","Data":"1eb90ca4b0f9eb4313e2957826a432e02a780e8d0f65cc8577c796411550a1cd"} Dec 05 09:51:04 crc kubenswrapper[4795]: I1205 09:51:04.448567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4zj" event={"ID":"51ef45ff-0983-4eea-a02d-1d9277e8ff1b","Type":"ContainerStarted","Data":"45c2a92dcfba235b25dd2021d1c19030a03ec587dc645a8350b2a62a0194bed6"} Dec 05 09:51:05 crc kubenswrapper[4795]: I1205 09:51:05.462011 4795 generic.go:334] "Generic (PLEG): container finished" podID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerID="45c2a92dcfba235b25dd2021d1c19030a03ec587dc645a8350b2a62a0194bed6" exitCode=0 Dec 05 09:51:05 crc kubenswrapper[4795]: I1205 09:51:05.462332 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4zj" 
event={"ID":"51ef45ff-0983-4eea-a02d-1d9277e8ff1b","Type":"ContainerDied","Data":"45c2a92dcfba235b25dd2021d1c19030a03ec587dc645a8350b2a62a0194bed6"} Dec 05 09:51:06 crc kubenswrapper[4795]: I1205 09:51:06.476295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4zj" event={"ID":"51ef45ff-0983-4eea-a02d-1d9277e8ff1b","Type":"ContainerStarted","Data":"7d9d42e97a1864154224cbec8feb658dae6a3ce50b0ba58a1035e9548a9f8d94"} Dec 05 09:51:06 crc kubenswrapper[4795]: I1205 09:51:06.503683 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xp4zj" podStartSLOduration=3.079128818 podStartE2EDuration="6.503603259s" podCreationTimestamp="2025-12-05 09:51:00 +0000 UTC" firstStartedPulling="2025-12-05 09:51:02.427528368 +0000 UTC m=+5214.000132107" lastFinishedPulling="2025-12-05 09:51:05.852002799 +0000 UTC m=+5217.424606548" observedRunningTime="2025-12-05 09:51:06.494297376 +0000 UTC m=+5218.066901115" watchObservedRunningTime="2025-12-05 09:51:06.503603259 +0000 UTC m=+5218.076206998" Dec 05 09:51:07 crc kubenswrapper[4795]: I1205 09:51:07.485834 4795 generic.go:334] "Generic (PLEG): container finished" podID="40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" containerID="630afc3a70b153edaa827cb656a9c5461ffb42451018c25397908b0023f00f15" exitCode=0 Dec 05 09:51:07 crc kubenswrapper[4795]: I1205 09:51:07.485922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" event={"ID":"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95","Type":"ContainerDied","Data":"630afc3a70b153edaa827cb656a9c5461ffb42451018c25397908b0023f00f15"} Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.596424 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.641235 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-sm9nn"] Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.651119 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-sm9nn"] Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.703745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-host\") pod \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\" (UID: \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\") " Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.703913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-host" (OuterVolumeSpecName: "host") pod "40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" (UID: "40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.704027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qndgk\" (UniqueName: \"kubernetes.io/projected/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-kube-api-access-qndgk\") pod \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\" (UID: \"40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95\") " Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.704476 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-host\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.713135 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-kube-api-access-qndgk" (OuterVolumeSpecName: "kube-api-access-qndgk") pod "40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" (UID: "40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95"). InnerVolumeSpecName "kube-api-access-qndgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.765025 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" path="/var/lib/kubelet/pods/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95/volumes" Dec 05 09:51:08 crc kubenswrapper[4795]: I1205 09:51:08.807500 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qndgk\" (UniqueName: \"kubernetes.io/projected/40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95-kube-api-access-qndgk\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:09 crc kubenswrapper[4795]: I1205 09:51:09.506258 4795 scope.go:117] "RemoveContainer" containerID="630afc3a70b153edaa827cb656a9c5461ffb42451018c25397908b0023f00f15" Dec 05 09:51:09 crc kubenswrapper[4795]: I1205 09:51:09.506889 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-sm9nn" Dec 05 09:51:09 crc kubenswrapper[4795]: I1205 09:51:09.850031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-vb5hk"] Dec 05 09:51:09 crc kubenswrapper[4795]: E1205 09:51:09.850517 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" containerName="container-00" Dec 05 09:51:09 crc kubenswrapper[4795]: I1205 09:51:09.850531 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" containerName="container-00" Dec 05 09:51:09 crc kubenswrapper[4795]: I1205 09:51:09.850754 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="40fbd174-f7a2-40d9-8eb4-c7ee2c8bfd95" containerName="container-00" Dec 05 09:51:09 crc kubenswrapper[4795]: I1205 09:51:09.851472 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:09 crc kubenswrapper[4795]: I1205 09:51:09.934713 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e4ac8e3-2705-4193-89b8-967dd6ababe0-host\") pod \"crc-debug-vb5hk\" (UID: \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\") " pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:09 crc kubenswrapper[4795]: I1205 09:51:09.935193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kr7\" (UniqueName: \"kubernetes.io/projected/2e4ac8e3-2705-4193-89b8-967dd6ababe0-kube-api-access-w9kr7\") pod \"crc-debug-vb5hk\" (UID: \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\") " pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.037655 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kr7\" (UniqueName: 
\"kubernetes.io/projected/2e4ac8e3-2705-4193-89b8-967dd6ababe0-kube-api-access-w9kr7\") pod \"crc-debug-vb5hk\" (UID: \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\") " pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.038127 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e4ac8e3-2705-4193-89b8-967dd6ababe0-host\") pod \"crc-debug-vb5hk\" (UID: \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\") " pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.038355 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e4ac8e3-2705-4193-89b8-967dd6ababe0-host\") pod \"crc-debug-vb5hk\" (UID: \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\") " pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.061081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kr7\" (UniqueName: \"kubernetes.io/projected/2e4ac8e3-2705-4193-89b8-967dd6ababe0-kube-api-access-w9kr7\") pod \"crc-debug-vb5hk\" (UID: \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\") " pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.173500 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.530741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" event={"ID":"2e4ac8e3-2705-4193-89b8-967dd6ababe0","Type":"ContainerStarted","Data":"e18338c1659cf76f9abfbb2a0135c4b7f08a95a38f6a93fa0d4c0a3bf9ab46ff"} Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.827707 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.827790 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.869971 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.872669 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:10 crc kubenswrapper[4795]: I1205 09:51:10.937659 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:11 crc kubenswrapper[4795]: I1205 09:51:11.543371 4795 generic.go:334] "Generic (PLEG): container finished" podID="2e4ac8e3-2705-4193-89b8-967dd6ababe0" containerID="9d3a8b0923057ffcfa34d2c704d877a8653d7953f0d6df091b2506d87cfad516" exitCode=0 Dec 05 09:51:11 crc 
kubenswrapper[4795]: I1205 09:51:11.543443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" event={"ID":"2e4ac8e3-2705-4193-89b8-967dd6ababe0","Type":"ContainerDied","Data":"9d3a8b0923057ffcfa34d2c704d877a8653d7953f0d6df091b2506d87cfad516"} Dec 05 09:51:11 crc kubenswrapper[4795]: I1205 09:51:11.608290 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:11 crc kubenswrapper[4795]: I1205 09:51:11.669346 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xp4zj"] Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.280163 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.312119 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9kr7\" (UniqueName: \"kubernetes.io/projected/2e4ac8e3-2705-4193-89b8-967dd6ababe0-kube-api-access-w9kr7\") pod \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\" (UID: \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\") " Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.312299 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e4ac8e3-2705-4193-89b8-967dd6ababe0-host\") pod \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\" (UID: \"2e4ac8e3-2705-4193-89b8-967dd6ababe0\") " Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.312798 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e4ac8e3-2705-4193-89b8-967dd6ababe0-host" (OuterVolumeSpecName: "host") pod "2e4ac8e3-2705-4193-89b8-967dd6ababe0" (UID: "2e4ac8e3-2705-4193-89b8-967dd6ababe0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.320042 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e4ac8e3-2705-4193-89b8-967dd6ababe0-kube-api-access-w9kr7" (OuterVolumeSpecName: "kube-api-access-w9kr7") pod "2e4ac8e3-2705-4193-89b8-967dd6ababe0" (UID: "2e4ac8e3-2705-4193-89b8-967dd6ababe0"). InnerVolumeSpecName "kube-api-access-w9kr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.414532 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9kr7\" (UniqueName: \"kubernetes.io/projected/2e4ac8e3-2705-4193-89b8-967dd6ababe0-kube-api-access-w9kr7\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.414569 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2e4ac8e3-2705-4193-89b8-967dd6ababe0-host\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.565746 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.565769 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/crc-debug-vb5hk" event={"ID":"2e4ac8e3-2705-4193-89b8-967dd6ababe0","Type":"ContainerDied","Data":"e18338c1659cf76f9abfbb2a0135c4b7f08a95a38f6a93fa0d4c0a3bf9ab46ff"} Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.565830 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e18338c1659cf76f9abfbb2a0135c4b7f08a95a38f6a93fa0d4c0a3bf9ab46ff" Dec 05 09:51:13 crc kubenswrapper[4795]: I1205 09:51:13.565908 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xp4zj" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerName="registry-server" containerID="cri-o://7d9d42e97a1864154224cbec8feb658dae6a3ce50b0ba58a1035e9548a9f8d94" gracePeriod=2 Dec 05 09:51:14 crc kubenswrapper[4795]: I1205 09:51:14.198439 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-vb5hk"] Dec 05 09:51:14 crc kubenswrapper[4795]: I1205 09:51:14.210047 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-vb5hk"] Dec 05 09:51:14 crc kubenswrapper[4795]: I1205 09:51:14.617117 4795 generic.go:334] "Generic (PLEG): container finished" podID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerID="7d9d42e97a1864154224cbec8feb658dae6a3ce50b0ba58a1035e9548a9f8d94" exitCode=0 Dec 05 09:51:14 crc kubenswrapper[4795]: I1205 09:51:14.617445 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4zj" event={"ID":"51ef45ff-0983-4eea-a02d-1d9277e8ff1b","Type":"ContainerDied","Data":"7d9d42e97a1864154224cbec8feb658dae6a3ce50b0ba58a1035e9548a9f8d94"} Dec 05 09:51:14 crc kubenswrapper[4795]: I1205 09:51:14.771038 4795 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="2e4ac8e3-2705-4193-89b8-967dd6ababe0" path="/var/lib/kubelet/pods/2e4ac8e3-2705-4193-89b8-967dd6ababe0/volumes" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.302002 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.361863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-utilities\") pod \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.362579 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-utilities" (OuterVolumeSpecName: "utilities") pod "51ef45ff-0983-4eea-a02d-1d9277e8ff1b" (UID: "51ef45ff-0983-4eea-a02d-1d9277e8ff1b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.452568 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-qjzmv"] Dec 05 09:51:15 crc kubenswrapper[4795]: E1205 09:51:15.453381 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerName="extract-content" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.453408 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerName="extract-content" Dec 05 09:51:15 crc kubenswrapper[4795]: E1205 09:51:15.453427 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerName="registry-server" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.453435 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerName="registry-server" Dec 05 09:51:15 crc kubenswrapper[4795]: E1205 09:51:15.453461 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e4ac8e3-2705-4193-89b8-967dd6ababe0" containerName="container-00" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.453470 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e4ac8e3-2705-4193-89b8-967dd6ababe0" containerName="container-00" Dec 05 09:51:15 crc kubenswrapper[4795]: E1205 09:51:15.453516 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerName="extract-utilities" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.453527 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerName="extract-utilities" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.454671 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e4ac8e3-2705-4193-89b8-967dd6ababe0" 
containerName="container-00" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.454714 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" containerName="registry-server" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.455531 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.462974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-catalog-content\") pod \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.463126 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86kd2\" (UniqueName: \"kubernetes.io/projected/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-kube-api-access-86kd2\") pod \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\" (UID: \"51ef45ff-0983-4eea-a02d-1d9277e8ff1b\") " Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.463784 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.474678 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-kube-api-access-86kd2" (OuterVolumeSpecName: "kube-api-access-86kd2") pod "51ef45ff-0983-4eea-a02d-1d9277e8ff1b" (UID: "51ef45ff-0983-4eea-a02d-1d9277e8ff1b"). InnerVolumeSpecName "kube-api-access-86kd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.565829 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfbbw\" (UniqueName: \"kubernetes.io/projected/35dd1907-13a0-45d5-add3-13be5a407f70-kube-api-access-sfbbw\") pod \"crc-debug-qjzmv\" (UID: \"35dd1907-13a0-45d5-add3-13be5a407f70\") " pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.566186 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35dd1907-13a0-45d5-add3-13be5a407f70-host\") pod \"crc-debug-qjzmv\" (UID: \"35dd1907-13a0-45d5-add3-13be5a407f70\") " pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.566511 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86kd2\" (UniqueName: \"kubernetes.io/projected/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-kube-api-access-86kd2\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.572001 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51ef45ff-0983-4eea-a02d-1d9277e8ff1b" (UID: "51ef45ff-0983-4eea-a02d-1d9277e8ff1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.629222 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4zj" event={"ID":"51ef45ff-0983-4eea-a02d-1d9277e8ff1b","Type":"ContainerDied","Data":"1eb90ca4b0f9eb4313e2957826a432e02a780e8d0f65cc8577c796411550a1cd"} Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.629273 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp4zj" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.629292 4795 scope.go:117] "RemoveContainer" containerID="7d9d42e97a1864154224cbec8feb658dae6a3ce50b0ba58a1035e9548a9f8d94" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.654211 4795 scope.go:117] "RemoveContainer" containerID="45c2a92dcfba235b25dd2021d1c19030a03ec587dc645a8350b2a62a0194bed6" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.667965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35dd1907-13a0-45d5-add3-13be5a407f70-host\") pod \"crc-debug-qjzmv\" (UID: \"35dd1907-13a0-45d5-add3-13be5a407f70\") " pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.668110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfbbw\" (UniqueName: \"kubernetes.io/projected/35dd1907-13a0-45d5-add3-13be5a407f70-kube-api-access-sfbbw\") pod \"crc-debug-qjzmv\" (UID: \"35dd1907-13a0-45d5-add3-13be5a407f70\") " pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.668237 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51ef45ff-0983-4eea-a02d-1d9277e8ff1b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:15 crc 
kubenswrapper[4795]: I1205 09:51:15.668656 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35dd1907-13a0-45d5-add3-13be5a407f70-host\") pod \"crc-debug-qjzmv\" (UID: \"35dd1907-13a0-45d5-add3-13be5a407f70\") " pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.686056 4795 scope.go:117] "RemoveContainer" containerID="453e793eeeedd1043520b23dd65ace8394116b86fbe75e95e5b3e71908bc38d7" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.691106 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xp4zj"] Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.691633 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfbbw\" (UniqueName: \"kubernetes.io/projected/35dd1907-13a0-45d5-add3-13be5a407f70-kube-api-access-sfbbw\") pod \"crc-debug-qjzmv\" (UID: \"35dd1907-13a0-45d5-add3-13be5a407f70\") " pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.704747 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xp4zj"] Dec 05 09:51:15 crc kubenswrapper[4795]: I1205 09:51:15.826115 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:16 crc kubenswrapper[4795]: I1205 09:51:16.643839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" event={"ID":"35dd1907-13a0-45d5-add3-13be5a407f70","Type":"ContainerStarted","Data":"1070aaaf3e70f250957326cdd19f9ede56a7e3933e44500e51e1a36e5d0f7817"} Dec 05 09:51:16 crc kubenswrapper[4795]: I1205 09:51:16.762763 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ef45ff-0983-4eea-a02d-1d9277e8ff1b" path="/var/lib/kubelet/pods/51ef45ff-0983-4eea-a02d-1d9277e8ff1b/volumes" Dec 05 09:51:17 crc kubenswrapper[4795]: I1205 09:51:17.657275 4795 generic.go:334] "Generic (PLEG): container finished" podID="35dd1907-13a0-45d5-add3-13be5a407f70" containerID="55cad5b6e588c92f15cb18900874aedd3ef50f3880ef3f38b15d52a63889f97f" exitCode=0 Dec 05 09:51:17 crc kubenswrapper[4795]: I1205 09:51:17.657435 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" event={"ID":"35dd1907-13a0-45d5-add3-13be5a407f70","Type":"ContainerDied","Data":"55cad5b6e588c92f15cb18900874aedd3ef50f3880ef3f38b15d52a63889f97f"} Dec 05 09:51:17 crc kubenswrapper[4795]: I1205 09:51:17.722240 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-qjzmv"] Dec 05 09:51:17 crc kubenswrapper[4795]: I1205 09:51:17.737841 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r7g2q/crc-debug-qjzmv"] Dec 05 09:51:18 crc kubenswrapper[4795]: I1205 09:51:18.778401 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:18 crc kubenswrapper[4795]: I1205 09:51:18.943543 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfbbw\" (UniqueName: \"kubernetes.io/projected/35dd1907-13a0-45d5-add3-13be5a407f70-kube-api-access-sfbbw\") pod \"35dd1907-13a0-45d5-add3-13be5a407f70\" (UID: \"35dd1907-13a0-45d5-add3-13be5a407f70\") " Dec 05 09:51:18 crc kubenswrapper[4795]: I1205 09:51:18.944199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35dd1907-13a0-45d5-add3-13be5a407f70-host\") pod \"35dd1907-13a0-45d5-add3-13be5a407f70\" (UID: \"35dd1907-13a0-45d5-add3-13be5a407f70\") " Dec 05 09:51:18 crc kubenswrapper[4795]: I1205 09:51:18.946275 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35dd1907-13a0-45d5-add3-13be5a407f70-host" (OuterVolumeSpecName: "host") pod "35dd1907-13a0-45d5-add3-13be5a407f70" (UID: "35dd1907-13a0-45d5-add3-13be5a407f70"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 05 09:51:18 crc kubenswrapper[4795]: I1205 09:51:18.975908 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35dd1907-13a0-45d5-add3-13be5a407f70-kube-api-access-sfbbw" (OuterVolumeSpecName: "kube-api-access-sfbbw") pod "35dd1907-13a0-45d5-add3-13be5a407f70" (UID: "35dd1907-13a0-45d5-add3-13be5a407f70"). InnerVolumeSpecName "kube-api-access-sfbbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:51:19 crc kubenswrapper[4795]: I1205 09:51:19.046829 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/35dd1907-13a0-45d5-add3-13be5a407f70-host\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:19 crc kubenswrapper[4795]: I1205 09:51:19.046863 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfbbw\" (UniqueName: \"kubernetes.io/projected/35dd1907-13a0-45d5-add3-13be5a407f70-kube-api-access-sfbbw\") on node \"crc\" DevicePath \"\"" Dec 05 09:51:19 crc kubenswrapper[4795]: I1205 09:51:19.677020 4795 scope.go:117] "RemoveContainer" containerID="55cad5b6e588c92f15cb18900874aedd3ef50f3880ef3f38b15d52a63889f97f" Dec 05 09:51:19 crc kubenswrapper[4795]: I1205 09:51:19.677096 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r7g2q/crc-debug-qjzmv" Dec 05 09:51:20 crc kubenswrapper[4795]: I1205 09:51:20.759937 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35dd1907-13a0-45d5-add3-13be5a407f70" path="/var/lib/kubelet/pods/35dd1907-13a0-45d5-add3-13be5a407f70/volumes" Dec 05 09:51:40 crc kubenswrapper[4795]: I1205 09:51:40.827116 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:51:40 crc kubenswrapper[4795]: I1205 09:51:40.827963 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:51:42 crc kubenswrapper[4795]: 
I1205 09:51:42.480460 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79bbf5b658-5cbs8_17050311-556c-4364-bd99-195d690178cb/barbican-api/0.log" Dec 05 09:51:42 crc kubenswrapper[4795]: I1205 09:51:42.532321 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-79bbf5b658-5cbs8_17050311-556c-4364-bd99-195d690178cb/barbican-api-log/0.log" Dec 05 09:51:42 crc kubenswrapper[4795]: I1205 09:51:42.981500 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6d6c47b668-nqmgd_726dc98e-9fe1-4b31-ba77-29d9e165b6d6/barbican-keystone-listener/0.log" Dec 05 09:51:43 crc kubenswrapper[4795]: I1205 09:51:43.024040 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6d6c47b668-nqmgd_726dc98e-9fe1-4b31-ba77-29d9e165b6d6/barbican-keystone-listener-log/0.log" Dec 05 09:51:43 crc kubenswrapper[4795]: I1205 09:51:43.407529 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d6d5498f5-mkdbf_c651edcf-b0db-4e86-9c04-6b26df481c95/barbican-worker/0.log" Dec 05 09:51:43 crc kubenswrapper[4795]: I1205 09:51:43.414459 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6d6d5498f5-mkdbf_c651edcf-b0db-4e86-9c04-6b26df481c95/barbican-worker-log/0.log" Dec 05 09:51:43 crc kubenswrapper[4795]: I1205 09:51:43.812488 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_aee9ade7-8fe6-4548-aa94-032d421ac9ab/ceilometer-central-agent/0.log" Dec 05 09:51:43 crc kubenswrapper[4795]: I1205 09:51:43.876496 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-k5shd_5e42d8c8-afcc-4c91-a967-5aac94f29019/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:44 crc kubenswrapper[4795]: I1205 09:51:44.005139 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_aee9ade7-8fe6-4548-aa94-032d421ac9ab/ceilometer-notification-agent/0.log" Dec 05 09:51:44 crc kubenswrapper[4795]: I1205 09:51:44.794655 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_aee9ade7-8fe6-4548-aa94-032d421ac9ab/sg-core/0.log" Dec 05 09:51:44 crc kubenswrapper[4795]: I1205 09:51:44.820275 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_aee9ade7-8fe6-4548-aa94-032d421ac9ab/proxy-httpd/0.log" Dec 05 09:51:44 crc kubenswrapper[4795]: I1205 09:51:44.910689 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_11d903b1-31af-4f63-ac26-a2bdb125af5b/cinder-api/0.log" Dec 05 09:51:45 crc kubenswrapper[4795]: I1205 09:51:45.280418 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_11d903b1-31af-4f63-ac26-a2bdb125af5b/cinder-api-log/0.log" Dec 05 09:51:45 crc kubenswrapper[4795]: I1205 09:51:45.380856 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce/cinder-scheduler/0.log" Dec 05 09:51:45 crc kubenswrapper[4795]: I1205 09:51:45.424930 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_f8ac5dea-f7ef-4ae3-91fc-5fbaf174c1ce/probe/0.log" Dec 05 09:51:45 crc kubenswrapper[4795]: I1205 09:51:45.662147 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-bkjkg_fdfda940-145b-497f-8adc-d001a4f852ed/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:45 crc kubenswrapper[4795]: I1205 09:51:45.803601 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-qh2cz_2733fd67-3848-4b52-8246-0aa3a4f60d10/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:46 crc kubenswrapper[4795]: I1205 
09:51:46.058779 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zwsqp_b895bd86-3c76-4653-8527-d1cfef368c37/init/0.log" Dec 05 09:51:46 crc kubenswrapper[4795]: I1205 09:51:46.377868 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zwsqp_b895bd86-3c76-4653-8527-d1cfef368c37/init/0.log" Dec 05 09:51:47 crc kubenswrapper[4795]: I1205 09:51:47.909882 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f3d4d78-a614-4724-ba42-bc6f0a44be83/glance-log/0.log" Dec 05 09:51:47 crc kubenswrapper[4795]: I1205 09:51:47.962760 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8f3d4d78-a614-4724-ba42-bc6f0a44be83/glance-httpd/0.log" Dec 05 09:51:47 crc kubenswrapper[4795]: I1205 09:51:47.966405 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-d4w2j_7eb44bb0-b9bd-4a64-97de-d1c08b927625/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:47 crc kubenswrapper[4795]: I1205 09:51:47.970524 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_acfa2531-9c7e-4017-b32d-3a4b07038cca/glance-httpd/0.log" Dec 05 09:51:48 crc kubenswrapper[4795]: I1205 09:51:48.150736 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-zwsqp_b895bd86-3c76-4653-8527-d1cfef368c37/dnsmasq-dns/0.log" Dec 05 09:51:48 crc kubenswrapper[4795]: I1205 09:51:48.322452 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57b485fdb4-h9cjs_f89d9173-0065-4beb-a1b6-ba7be5094a58/horizon/3.log" Dec 05 09:51:48 crc kubenswrapper[4795]: I1205 09:51:48.438399 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_acfa2531-9c7e-4017-b32d-3a4b07038cca/glance-log/0.log" 
Dec 05 09:51:48 crc kubenswrapper[4795]: I1205 09:51:48.621992 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57b485fdb4-h9cjs_f89d9173-0065-4beb-a1b6-ba7be5094a58/horizon/2.log" Dec 05 09:51:49 crc kubenswrapper[4795]: I1205 09:51:49.075314 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2f589_74a4b975-a0ad-4798-8f20-2afce09644f9/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:49 crc kubenswrapper[4795]: I1205 09:51:49.181815 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tg6pn_d2617c20-6235-43aa-85e0-6bed6d4649e3/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:49 crc kubenswrapper[4795]: I1205 09:51:49.339731 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-57b485fdb4-h9cjs_f89d9173-0065-4beb-a1b6-ba7be5094a58/horizon-log/0.log" Dec 05 09:51:49 crc kubenswrapper[4795]: I1205 09:51:49.506932 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29415421-ppwmt_b29b36d5-2393-4db6-a124-a9e2adc28069/keystone-cron/0.log" Dec 05 09:51:50 crc kubenswrapper[4795]: I1205 09:51:50.116378 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85d5d69654-vzspj_f0bb2937-2db6-41a5-b930-b1d479cd8a5f/keystone-api/0.log" Dec 05 09:51:50 crc kubenswrapper[4795]: I1205 09:51:50.229563 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_1fd34f8e-69eb-4af2-ae44-4da71219ef35/kube-state-metrics/0.log" Dec 05 09:51:50 crc kubenswrapper[4795]: I1205 09:51:50.358099 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-g4vq9_cfe4932a-495e-46cb-981d-71465ed7e1ff/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:51 crc kubenswrapper[4795]: I1205 
09:51:51.109225 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-6qzbt_7ae18232-77c3-44cb-909e-fda5169b4d1c/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:51 crc kubenswrapper[4795]: I1205 09:51:51.216931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6766d78d6c-l65vj_8ada623e-3e62-480c-a681-19685e13dc82/neutron-httpd/0.log" Dec 05 09:51:51 crc kubenswrapper[4795]: I1205 09:51:51.496729 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6766d78d6c-l65vj_8ada623e-3e62-480c-a681-19685e13dc82/neutron-api/0.log" Dec 05 09:51:52 crc kubenswrapper[4795]: I1205 09:51:52.625953 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d04716e6-118a-45f7-b0a2-038650cb3baf/nova-cell0-conductor-conductor/0.log" Dec 05 09:51:53 crc kubenswrapper[4795]: I1205 09:51:53.345140 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_51a69c13-aa37-4fad-a00f-2c1aafc627c4/nova-api-log/0.log" Dec 05 09:51:53 crc kubenswrapper[4795]: I1205 09:51:53.360638 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6d31dbcf-b637-4b29-a68c-0b8f4226caf5/nova-cell1-conductor-conductor/0.log" Dec 05 09:51:53 crc kubenswrapper[4795]: I1205 09:51:53.636320 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a8fc111b-5e38-4955-a1ec-dbd8e155fd2f/nova-cell1-novncproxy-novncproxy/0.log" Dec 05 09:51:53 crc kubenswrapper[4795]: I1205 09:51:53.836773 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-m5khq_6fca34cb-0c72-422c-86e9-638584bb9dcb/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:53 crc kubenswrapper[4795]: I1205 09:51:53.893678 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_51a69c13-aa37-4fad-a00f-2c1aafc627c4/nova-api-api/0.log" Dec 05 09:51:54 crc kubenswrapper[4795]: I1205 09:51:54.282002 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_67a74420-332a-4b9c-b677-d7c61bb7ce5e/nova-metadata-log/0.log" Dec 05 09:51:54 crc kubenswrapper[4795]: I1205 09:51:54.826088 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_09a55d95-050f-4262-9bb4-7dc81ae6ea34/mysql-bootstrap/0.log" Dec 05 09:51:55 crc kubenswrapper[4795]: I1205 09:51:55.030141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_09a55d95-050f-4262-9bb4-7dc81ae6ea34/mysql-bootstrap/0.log" Dec 05 09:51:55 crc kubenswrapper[4795]: I1205 09:51:55.041132 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_cf163493-6f4f-47b8-9478-683cf5f07868/nova-scheduler-scheduler/0.log" Dec 05 09:51:55 crc kubenswrapper[4795]: I1205 09:51:55.306087 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_09a55d95-050f-4262-9bb4-7dc81ae6ea34/galera/0.log" Dec 05 09:51:55 crc kubenswrapper[4795]: I1205 09:51:55.495565 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6/mysql-bootstrap/0.log" Dec 05 09:51:55 crc kubenswrapper[4795]: I1205 09:51:55.821689 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6/mysql-bootstrap/0.log" Dec 05 09:51:55 crc kubenswrapper[4795]: I1205 09:51:55.840355 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e8c2d84c-15c8-48c5-a0d2-ed17cb2c09a6/galera/0.log" Dec 05 09:51:56 crc kubenswrapper[4795]: I1205 09:51:56.144695 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_61c38f36-24c0-4c36-986c-8a7552eadfbb/openstackclient/0.log" Dec 05 09:51:56 crc kubenswrapper[4795]: I1205 09:51:56.236141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-842lt_4e19a64f-4ae5-4731-98e0-dfef56849949/openstack-network-exporter/0.log" Dec 05 09:51:56 crc kubenswrapper[4795]: I1205 09:51:56.844477 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_67a74420-332a-4b9c-b677-d7c61bb7ce5e/nova-metadata-metadata/0.log" Dec 05 09:51:56 crc kubenswrapper[4795]: I1205 09:51:56.880947 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5vnnk_168f746c-97aa-4a6b-9e54-d365580aad3e/ovsdb-server-init/0.log" Dec 05 09:51:57 crc kubenswrapper[4795]: I1205 09:51:57.328100 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5vnnk_168f746c-97aa-4a6b-9e54-d365580aad3e/ovs-vswitchd/0.log" Dec 05 09:51:57 crc kubenswrapper[4795]: I1205 09:51:57.381394 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5vnnk_168f746c-97aa-4a6b-9e54-d365580aad3e/ovsdb-server-init/0.log" Dec 05 09:51:57 crc kubenswrapper[4795]: I1205 09:51:57.545485 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5vnnk_168f746c-97aa-4a6b-9e54-d365580aad3e/ovsdb-server/0.log" Dec 05 09:51:57 crc kubenswrapper[4795]: I1205 09:51:57.707496 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-pbgkm_ec90f56f-9ed8-4175-9736-6e0f07d7078f/ovn-controller/0.log" Dec 05 09:51:57 crc kubenswrapper[4795]: I1205 09:51:57.885443 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-md59b_3699268e-1a7d-4a95-9a21-538ddfff9e54/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:51:57 crc kubenswrapper[4795]: I1205 09:51:57.995654 
4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6d7fe8fd-0377-462d-b17e-1ec92a8d0464/openstack-network-exporter/0.log" Dec 05 09:51:58 crc kubenswrapper[4795]: I1205 09:51:58.086322 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6d7fe8fd-0377-462d-b17e-1ec92a8d0464/ovn-northd/0.log" Dec 05 09:51:58 crc kubenswrapper[4795]: I1205 09:51:58.420475 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_90e5bcaa-7346-4ac2-bb1b-453e46dec234/openstack-network-exporter/0.log" Dec 05 09:51:58 crc kubenswrapper[4795]: I1205 09:51:58.502787 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_90e5bcaa-7346-4ac2-bb1b-453e46dec234/ovsdbserver-nb/0.log" Dec 05 09:51:58 crc kubenswrapper[4795]: I1205 09:51:58.651755 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_770d6237-f0f8-4646-9df7-85f07fa9f48b/openstack-network-exporter/0.log" Dec 05 09:51:58 crc kubenswrapper[4795]: I1205 09:51:58.875876 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_770d6237-f0f8-4646-9df7-85f07fa9f48b/ovsdbserver-sb/0.log" Dec 05 09:51:59 crc kubenswrapper[4795]: I1205 09:51:59.187320 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56588789f4-7xbdx_40ecc6c1-814a-40dc-988b-d4b67a58794b/placement-api/0.log" Dec 05 09:51:59 crc kubenswrapper[4795]: I1205 09:51:59.303994 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5139f934-4821-4038-9401-c22f469bf070/setup-container/0.log" Dec 05 09:51:59 crc kubenswrapper[4795]: I1205 09:51:59.403087 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56588789f4-7xbdx_40ecc6c1-814a-40dc-988b-d4b67a58794b/placement-log/0.log" Dec 05 09:51:59 crc kubenswrapper[4795]: I1205 09:51:59.651594 4795 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5139f934-4821-4038-9401-c22f469bf070/rabbitmq/0.log" Dec 05 09:51:59 crc kubenswrapper[4795]: I1205 09:51:59.683792 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_5139f934-4821-4038-9401-c22f469bf070/setup-container/0.log" Dec 05 09:51:59 crc kubenswrapper[4795]: I1205 09:51:59.821234 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7ad9b797-2884-4af6-8a64-8f82b3523d3e/setup-container/0.log" Dec 05 09:52:00 crc kubenswrapper[4795]: I1205 09:52:00.063821 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7ad9b797-2884-4af6-8a64-8f82b3523d3e/setup-container/0.log" Dec 05 09:52:00 crc kubenswrapper[4795]: I1205 09:52:00.165812 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7ad9b797-2884-4af6-8a64-8f82b3523d3e/rabbitmq/0.log" Dec 05 09:52:00 crc kubenswrapper[4795]: I1205 09:52:00.191448 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_468214b1-1b8a-4714-a2b5-9913dead10a6/memcached/0.log" Dec 05 09:52:00 crc kubenswrapper[4795]: I1205 09:52:00.253415 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-k9222_c21efe0b-8f08-49d0-9723-8497f78e7471/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:52:00 crc kubenswrapper[4795]: I1205 09:52:00.879085 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-pvmj4_e83a9af4-07e8-4f0e-a764-19e3f093fb2a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:52:00 crc kubenswrapper[4795]: I1205 09:52:00.889004 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-c4gvd_62af26c9-a1a2-43e6-9f1e-0ea9e48042bc/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:52:01 crc kubenswrapper[4795]: I1205 09:52:01.030305 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7xjhl_939bdde4-5c5c-4d05-bf99-f5ae1fe7216a/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:52:01 crc kubenswrapper[4795]: I1205 09:52:01.767977 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-584d9d78f9-hwfvk_78ae9e33-4a1a-4296-8b17-65c7775bd5ec/proxy-httpd/0.log" Dec 05 09:52:01 crc kubenswrapper[4795]: I1205 09:52:01.796820 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ktqqs_257beaf8-8804-48c7-ac78-c12ace238dd2/ssh-known-hosts-edpm-deployment/0.log" Dec 05 09:52:01 crc kubenswrapper[4795]: I1205 09:52:01.845960 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-584d9d78f9-hwfvk_78ae9e33-4a1a-4296-8b17-65c7775bd5ec/proxy-server/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.038001 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-m46d5_7d94f47e-cb5c-427e-b529-dee69261109f/swift-ring-rebalance/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.107216 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/account-auditor/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.145756 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/account-replicator/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.219222 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/account-reaper/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.391566 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/account-server/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.456930 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/container-replicator/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.489518 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/container-auditor/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.561118 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/container-updater/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.609725 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/container-server/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.805104 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/object-auditor/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.834797 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/object-replicator/0.log" Dec 05 09:52:02 crc kubenswrapper[4795]: I1205 09:52:02.892623 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/object-expirer/0.log" Dec 05 09:52:03 crc kubenswrapper[4795]: I1205 09:52:03.185038 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/swift-recon-cron/0.log" Dec 05 09:52:03 crc kubenswrapper[4795]: I1205 09:52:03.185358 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/rsync/0.log" Dec 05 09:52:03 crc kubenswrapper[4795]: I1205 09:52:03.187381 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/object-server/0.log" Dec 05 09:52:03 crc kubenswrapper[4795]: I1205 09:52:03.189311 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c346ae47-7294-4960-b4f3-9d791c931a12/object-updater/0.log" Dec 05 09:52:03 crc kubenswrapper[4795]: I1205 09:52:03.973466 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-s8cw6_594805cd-d62b-47e5-9ad8-1c423b5fcebd/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:52:04 crc kubenswrapper[4795]: I1205 09:52:04.076279 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e7982c9e-f3b2-416c-bafa-c089c65e882c/test-operator-logs-container/0.log" Dec 05 09:52:04 crc kubenswrapper[4795]: I1205 09:52:04.143387 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_7f223c39-817f-4bac-9c3b-490359a0e44d/tempest-tests-tempest-tests-runner/0.log" Dec 05 09:52:04 crc kubenswrapper[4795]: I1205 09:52:04.696804 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-v2qbg_b98ff6b5-6e26-499a-a777-922aaa749b13/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 05 09:52:10 crc kubenswrapper[4795]: I1205 09:52:10.827408 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:52:10 crc kubenswrapper[4795]: I1205 09:52:10.828170 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:52:10 crc kubenswrapper[4795]: I1205 09:52:10.828244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 09:52:10 crc kubenswrapper[4795]: I1205 09:52:10.829781 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 09:52:10 crc kubenswrapper[4795]: I1205 09:52:10.829951 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" gracePeriod=600 Dec 05 09:52:10 crc kubenswrapper[4795]: E1205 09:52:10.956866 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:52:11 crc kubenswrapper[4795]: I1205 09:52:11.298664 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" exitCode=0 Dec 05 09:52:11 crc kubenswrapper[4795]: I1205 09:52:11.298740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6"} Dec 05 09:52:11 crc kubenswrapper[4795]: I1205 09:52:11.298783 4795 scope.go:117] "RemoveContainer" containerID="0463266c4052607fd5db6f7384bcd4a6cdf836c5b67ce12ebe3126e72b49b3b3" Dec 05 09:52:11 crc kubenswrapper[4795]: I1205 09:52:11.299723 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:52:11 crc kubenswrapper[4795]: E1205 09:52:11.300005 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:52:23 crc kubenswrapper[4795]: I1205 09:52:23.751294 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:52:23 crc kubenswrapper[4795]: E1205 09:52:23.752085 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:52:30 crc kubenswrapper[4795]: I1205 09:52:30.818501 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pzzsx"] Dec 05 09:52:30 crc kubenswrapper[4795]: E1205 09:52:30.819468 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35dd1907-13a0-45d5-add3-13be5a407f70" containerName="container-00" Dec 05 09:52:30 crc kubenswrapper[4795]: I1205 09:52:30.819481 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="35dd1907-13a0-45d5-add3-13be5a407f70" containerName="container-00" Dec 05 09:52:30 crc kubenswrapper[4795]: I1205 09:52:30.819736 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="35dd1907-13a0-45d5-add3-13be5a407f70" containerName="container-00" Dec 05 09:52:30 crc kubenswrapper[4795]: I1205 09:52:30.824856 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:30 crc kubenswrapper[4795]: I1205 09:52:30.859848 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzzsx"] Dec 05 09:52:30 crc kubenswrapper[4795]: I1205 09:52:30.969224 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-utilities\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:30 crc kubenswrapper[4795]: I1205 09:52:30.969359 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thmtw\" (UniqueName: \"kubernetes.io/projected/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-kube-api-access-thmtw\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:30 crc kubenswrapper[4795]: I1205 09:52:30.969416 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-catalog-content\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:31 crc kubenswrapper[4795]: I1205 09:52:31.072284 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-catalog-content\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:31 crc kubenswrapper[4795]: I1205 09:52:31.072548 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-utilities\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:31 crc kubenswrapper[4795]: I1205 09:52:31.072607 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thmtw\" (UniqueName: \"kubernetes.io/projected/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-kube-api-access-thmtw\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:31 crc kubenswrapper[4795]: I1205 09:52:31.073084 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-catalog-content\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:31 crc kubenswrapper[4795]: I1205 09:52:31.073415 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-utilities\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:31 crc kubenswrapper[4795]: I1205 09:52:31.099517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thmtw\" (UniqueName: \"kubernetes.io/projected/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-kube-api-access-thmtw\") pod \"redhat-marketplace-pzzsx\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:31 crc kubenswrapper[4795]: I1205 09:52:31.146639 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:31 crc kubenswrapper[4795]: I1205 09:52:31.858809 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzzsx"] Dec 05 09:52:32 crc kubenswrapper[4795]: I1205 09:52:32.550333 4795 generic.go:334] "Generic (PLEG): container finished" podID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerID="6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f" exitCode=0 Dec 05 09:52:32 crc kubenswrapper[4795]: I1205 09:52:32.550430 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzzsx" event={"ID":"d54c4a05-5142-4706-91ba-dbf0d7a50fc1","Type":"ContainerDied","Data":"6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f"} Dec 05 09:52:32 crc kubenswrapper[4795]: I1205 09:52:32.550662 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzzsx" event={"ID":"d54c4a05-5142-4706-91ba-dbf0d7a50fc1","Type":"ContainerStarted","Data":"886b2664737f20a7383b8a9e312e496605889a251b166382d81cf4c705b4ea82"} Dec 05 09:52:33 crc kubenswrapper[4795]: I1205 09:52:33.564854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzzsx" event={"ID":"d54c4a05-5142-4706-91ba-dbf0d7a50fc1","Type":"ContainerStarted","Data":"06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f"} Dec 05 09:52:34 crc kubenswrapper[4795]: I1205 09:52:34.577144 4795 generic.go:334] "Generic (PLEG): container finished" podID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerID="06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f" exitCode=0 Dec 05 09:52:34 crc kubenswrapper[4795]: I1205 09:52:34.577238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzzsx" 
event={"ID":"d54c4a05-5142-4706-91ba-dbf0d7a50fc1","Type":"ContainerDied","Data":"06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f"} Dec 05 09:52:35 crc kubenswrapper[4795]: I1205 09:52:35.597875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzzsx" event={"ID":"d54c4a05-5142-4706-91ba-dbf0d7a50fc1","Type":"ContainerStarted","Data":"43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3"} Dec 05 09:52:35 crc kubenswrapper[4795]: I1205 09:52:35.633416 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pzzsx" podStartSLOduration=3.219698307 podStartE2EDuration="5.633395332s" podCreationTimestamp="2025-12-05 09:52:30 +0000 UTC" firstStartedPulling="2025-12-05 09:52:32.552470683 +0000 UTC m=+5304.125074422" lastFinishedPulling="2025-12-05 09:52:34.966167708 +0000 UTC m=+5306.538771447" observedRunningTime="2025-12-05 09:52:35.627818001 +0000 UTC m=+5307.200421740" watchObservedRunningTime="2025-12-05 09:52:35.633395332 +0000 UTC m=+5307.205999071" Dec 05 09:52:36 crc kubenswrapper[4795]: I1205 09:52:36.747340 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:52:36 crc kubenswrapper[4795]: E1205 09:52:36.747640 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:52:41 crc kubenswrapper[4795]: I1205 09:52:41.147201 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:41 crc 
kubenswrapper[4795]: I1205 09:52:41.147898 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:41 crc kubenswrapper[4795]: I1205 09:52:41.211761 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:41 crc kubenswrapper[4795]: I1205 09:52:41.491661 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf_2c87c120-561f-4ce2-b47c-b99fb3ea4283/util/0.log" Dec 05 09:52:41 crc kubenswrapper[4795]: I1205 09:52:41.721244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:41 crc kubenswrapper[4795]: I1205 09:52:41.794026 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf_2c87c120-561f-4ce2-b47c-b99fb3ea4283/pull/0.log" Dec 05 09:52:41 crc kubenswrapper[4795]: I1205 09:52:41.802770 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzzsx"] Dec 05 09:52:41 crc kubenswrapper[4795]: I1205 09:52:41.861430 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf_2c87c120-561f-4ce2-b47c-b99fb3ea4283/util/0.log" Dec 05 09:52:41 crc kubenswrapper[4795]: I1205 09:52:41.920224 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf_2c87c120-561f-4ce2-b47c-b99fb3ea4283/pull/0.log" Dec 05 09:52:42 crc kubenswrapper[4795]: I1205 09:52:42.140193 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf_2c87c120-561f-4ce2-b47c-b99fb3ea4283/pull/0.log" Dec 05 09:52:42 crc kubenswrapper[4795]: I1205 09:52:42.152039 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf_2c87c120-561f-4ce2-b47c-b99fb3ea4283/util/0.log" Dec 05 09:52:42 crc kubenswrapper[4795]: I1205 09:52:42.274802 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3fdeabd598e666549f1e5f516fb59960bf98e2529c8d7de3743fa0cb4dh7qwf_2c87c120-561f-4ce2-b47c-b99fb3ea4283/extract/0.log" Dec 05 09:52:42 crc kubenswrapper[4795]: I1205 09:52:42.499491 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-fgtwh_f42b62b8-5856-4300-8bf6-b2299f1b5612/kube-rbac-proxy/0.log" Dec 05 09:52:42 crc kubenswrapper[4795]: I1205 09:52:42.556344 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-fgtwh_f42b62b8-5856-4300-8bf6-b2299f1b5612/manager/0.log" Dec 05 09:52:43 crc kubenswrapper[4795]: I1205 09:52:43.046525 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-jcbss_4d920ea1-76ae-4bb3-831f-e83ac4d57fbe/kube-rbac-proxy/0.log" Dec 05 09:52:43 crc kubenswrapper[4795]: I1205 09:52:43.195589 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-jcbss_4d920ea1-76ae-4bb3-831f-e83ac4d57fbe/manager/0.log" Dec 05 09:52:43 crc kubenswrapper[4795]: I1205 09:52:43.317690 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-s6td2_67fad932-d045-4ee7-ae85-bf528a431eb3/kube-rbac-proxy/0.log" Dec 05 09:52:43 crc kubenswrapper[4795]: 
I1205 09:52:43.417872 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-s6td2_67fad932-d045-4ee7-ae85-bf528a431eb3/manager/0.log" Dec 05 09:52:43 crc kubenswrapper[4795]: I1205 09:52:43.638442 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-kvhsl_6fb6884d-9a5b-40bd-bc15-d51a6a645645/kube-rbac-proxy/0.log" Dec 05 09:52:43 crc kubenswrapper[4795]: I1205 09:52:43.683223 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pzzsx" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerName="registry-server" containerID="cri-o://43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3" gracePeriod=2 Dec 05 09:52:43 crc kubenswrapper[4795]: I1205 09:52:43.782781 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-kvhsl_6fb6884d-9a5b-40bd-bc15-d51a6a645645/manager/0.log" Dec 05 09:52:43 crc kubenswrapper[4795]: I1205 09:52:43.861224 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-t29gj_e8ef9580-dae6-4db8-aa6d-5c600b8ae507/kube-rbac-proxy/0.log" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.187559 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-t29gj_e8ef9580-dae6-4db8-aa6d-5c600b8ae507/manager/0.log" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.224813 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.349896 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-j6sk2_3bb0c684-dce6-453f-b3ba-184b11da37c8/kube-rbac-proxy/0.log" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.402403 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-utilities\") pod \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.403718 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thmtw\" (UniqueName: \"kubernetes.io/projected/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-kube-api-access-thmtw\") pod \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.405036 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-catalog-content\") pod \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\" (UID: \"d54c4a05-5142-4706-91ba-dbf0d7a50fc1\") " Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.403455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-utilities" (OuterVolumeSpecName: "utilities") pod "d54c4a05-5142-4706-91ba-dbf0d7a50fc1" (UID: "d54c4a05-5142-4706-91ba-dbf0d7a50fc1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.413714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-kube-api-access-thmtw" (OuterVolumeSpecName: "kube-api-access-thmtw") pod "d54c4a05-5142-4706-91ba-dbf0d7a50fc1" (UID: "d54c4a05-5142-4706-91ba-dbf0d7a50fc1"). InnerVolumeSpecName "kube-api-access-thmtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.434403 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d54c4a05-5142-4706-91ba-dbf0d7a50fc1" (UID: "d54c4a05-5142-4706-91ba-dbf0d7a50fc1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.449072 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-j6sk2_3bb0c684-dce6-453f-b3ba-184b11da37c8/manager/0.log" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.508325 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.508573 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.508702 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thmtw\" (UniqueName: \"kubernetes.io/projected/d54c4a05-5142-4706-91ba-dbf0d7a50fc1-kube-api-access-thmtw\") on 
node \"crc\" DevicePath \"\"" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.636896 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-fb4c2_2deba92d-4689-450c-95e7-36cb8fc196c1/kube-rbac-proxy/0.log" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.697820 4795 generic.go:334] "Generic (PLEG): container finished" podID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerID="43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3" exitCode=0 Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.697884 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzzsx" event={"ID":"d54c4a05-5142-4706-91ba-dbf0d7a50fc1","Type":"ContainerDied","Data":"43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3"} Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.698033 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pzzsx" event={"ID":"d54c4a05-5142-4706-91ba-dbf0d7a50fc1","Type":"ContainerDied","Data":"886b2664737f20a7383b8a9e312e496605889a251b166382d81cf4c705b4ea82"} Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.698058 4795 scope.go:117] "RemoveContainer" containerID="43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.697952 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pzzsx" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.729183 4795 scope.go:117] "RemoveContainer" containerID="06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.760084 4795 scope.go:117] "RemoveContainer" containerID="6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.838743 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzzsx"] Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.838785 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pzzsx"] Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.866004 4795 scope.go:117] "RemoveContainer" containerID="43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3" Dec 05 09:52:44 crc kubenswrapper[4795]: E1205 09:52:44.877509 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3\": container with ID starting with 43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3 not found: ID does not exist" containerID="43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.877864 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3"} err="failed to get container status \"43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3\": rpc error: code = NotFound desc = could not find container \"43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3\": container with ID starting with 43b3873758f500b91efda66d640acc3a1a612803328b91b077c49ba089d0f1a3 not found: 
ID does not exist" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.877975 4795 scope.go:117] "RemoveContainer" containerID="06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f" Dec 05 09:52:44 crc kubenswrapper[4795]: E1205 09:52:44.881951 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f\": container with ID starting with 06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f not found: ID does not exist" containerID="06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.882017 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f"} err="failed to get container status \"06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f\": rpc error: code = NotFound desc = could not find container \"06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f\": container with ID starting with 06034b8c874287722004cc393b57e01540d3f752c762ff0934eaabdfaedb424f not found: ID does not exist" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.882084 4795 scope.go:117] "RemoveContainer" containerID="6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f" Dec 05 09:52:44 crc kubenswrapper[4795]: E1205 09:52:44.885072 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f\": container with ID starting with 6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f not found: ID does not exist" containerID="6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.885113 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f"} err="failed to get container status \"6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f\": rpc error: code = NotFound desc = could not find container \"6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f\": container with ID starting with 6f97c2615966fee587e58f014a091a41dcdaa73701eda682199360fdf375bd2f not found: ID does not exist" Dec 05 09:52:44 crc kubenswrapper[4795]: I1205 09:52:44.902796 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-fb4c2_2deba92d-4689-450c-95e7-36cb8fc196c1/manager/0.log" Dec 05 09:52:45 crc kubenswrapper[4795]: I1205 09:52:45.047185 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-b9vm2_61b3d615-1654-4fc5-a601-43f68103ac52/kube-rbac-proxy/0.log" Dec 05 09:52:45 crc kubenswrapper[4795]: I1205 09:52:45.049164 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-b9vm2_61b3d615-1654-4fc5-a601-43f68103ac52/manager/0.log" Dec 05 09:52:45 crc kubenswrapper[4795]: I1205 09:52:45.189013 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-8dh85_8db4219d-3a4b-4470-9c6d-db1b98c9b3dc/kube-rbac-proxy/0.log" Dec 05 09:52:45 crc kubenswrapper[4795]: I1205 09:52:45.394763 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-8dh85_8db4219d-3a4b-4470-9c6d-db1b98c9b3dc/manager/0.log" Dec 05 09:52:45 crc kubenswrapper[4795]: I1205 09:52:45.514695 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-tkk5l_63be1623-e1cd-4904-99cb-9497a6596599/kube-rbac-proxy/0.log" Dec 05 09:52:45 crc kubenswrapper[4795]: I1205 09:52:45.649428 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-tkk5l_63be1623-e1cd-4904-99cb-9497a6596599/manager/0.log" Dec 05 09:52:45 crc kubenswrapper[4795]: I1205 09:52:45.766937 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-fxncz_df2401aa-47d5-4301-93ea-41a8c8b32cc9/kube-rbac-proxy/0.log" Dec 05 09:52:45 crc kubenswrapper[4795]: I1205 09:52:45.904745 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-fxncz_df2401aa-47d5-4301-93ea-41a8c8b32cc9/manager/0.log" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.057855 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-m6snn_e8acf865-8373-4a37-ba22-bc276e596f2d/kube-rbac-proxy/0.log" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.123837 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-m6snn_e8acf865-8373-4a37-ba22-bc276e596f2d/manager/0.log" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.358931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mbkw8_b41e588d-948f-4709-8717-cfbe8fbba4c9/kube-rbac-proxy/0.log" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.406801 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-mbkw8_b41e588d-948f-4709-8717-cfbe8fbba4c9/manager/0.log" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.542540 
4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-n89xx_33394c60-0058-4c0a-8582-cdd95c25bd19/kube-rbac-proxy/0.log" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.618411 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-n89xx_33394c60-0058-4c0a-8582-cdd95c25bd19/manager/0.log" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.723505 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd45xs24_60a49846-77cd-440b-b8b2-988cd340dd18/manager/0.log" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.760991 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" path="/var/lib/kubelet/pods/d54c4a05-5142-4706-91ba-dbf0d7a50fc1/volumes" Dec 05 09:52:46 crc kubenswrapper[4795]: I1205 09:52:46.809001 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd45xs24_60a49846-77cd-440b-b8b2-988cd340dd18/kube-rbac-proxy/0.log" Dec 05 09:52:47 crc kubenswrapper[4795]: I1205 09:52:47.252466 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-q5zwr_74cd8f10-9003-46be-992c-2b23202839bb/registry-server/0.log" Dec 05 09:52:47 crc kubenswrapper[4795]: I1205 09:52:47.287916 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6ccbc6b756-8vnkb_d04526ec-00e4-4fef-8a4f-346bac707512/operator/0.log" Dec 05 09:52:47 crc kubenswrapper[4795]: I1205 09:52:47.821158 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mjbz5_4759b941-a4a1-470a-99e1-9acc898804e9/kube-rbac-proxy/0.log" Dec 05 09:52:47 crc 
kubenswrapper[4795]: I1205 09:52:47.922687 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-mjbz5_4759b941-a4a1-470a-99e1-9acc898804e9/manager/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.129512 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-mjmbn_a0f6ca90-a15b-4fd1-b934-c4428e4c0d90/kube-rbac-proxy/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.364081 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-g6ghd_d83675df-1935-473f-925e-5b40d61aadfa/operator/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.431684 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-mjmbn_a0f6ca90-a15b-4fd1-b934-c4428e4c0d90/manager/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.519123 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-8c7b64495-p2lwl_b993e6ee-cadd-4671-99dd-bb54433c0064/manager/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.546570 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-stfxh_6c624ff1-59b4-4d7a-af4c-0dd48235842b/kube-rbac-proxy/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.640465 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-stfxh_6c624ff1-59b4-4d7a-af4c-0dd48235842b/manager/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.735514 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-vbzr4_75cb47f2-6153-4f9b-9634-151793360092/kube-rbac-proxy/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.870952 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-vbzr4_75cb47f2-6153-4f9b-9634-151793360092/manager/0.log" Dec 05 09:52:48 crc kubenswrapper[4795]: I1205 09:52:48.910392 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xbwwz_4d769f5c-b4d9-4049-9cf8-73d02b343b1f/kube-rbac-proxy/0.log" Dec 05 09:52:49 crc kubenswrapper[4795]: I1205 09:52:49.005313 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-xbwwz_4d769f5c-b4d9-4049-9cf8-73d02b343b1f/manager/0.log" Dec 05 09:52:49 crc kubenswrapper[4795]: I1205 09:52:49.110374 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-pzc2j_201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3/kube-rbac-proxy/0.log" Dec 05 09:52:49 crc kubenswrapper[4795]: I1205 09:52:49.111975 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-pzc2j_201a6dcf-235e-4ac0-b42f-9dfa86c2ffe3/manager/0.log" Dec 05 09:52:51 crc kubenswrapper[4795]: I1205 09:52:51.747598 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:52:51 crc kubenswrapper[4795]: E1205 09:52:51.748194 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:53:04 crc kubenswrapper[4795]: I1205 09:53:04.752727 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:53:04 crc kubenswrapper[4795]: E1205 09:53:04.753664 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:53:14 crc kubenswrapper[4795]: I1205 09:53:14.063575 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-x6pjq_59acd2a1-e0cc-439c-9e9e-a2ca39e05e52/kube-rbac-proxy/0.log" Dec 05 09:53:14 crc kubenswrapper[4795]: I1205 09:53:14.063705 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2ffkd_34897f01-f688-4dc5-8fdc-4468365baa92/control-plane-machine-set-operator/0.log" Dec 05 09:53:14 crc kubenswrapper[4795]: I1205 09:53:14.096757 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-x6pjq_59acd2a1-e0cc-439c-9e9e-a2ca39e05e52/machine-api-operator/0.log" Dec 05 09:53:18 crc kubenswrapper[4795]: I1205 09:53:18.755875 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:53:18 crc kubenswrapper[4795]: E1205 09:53:18.756832 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:53:29 crc kubenswrapper[4795]: I1205 09:53:29.187201 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-776ch_0e5cb2d0-ad47-445f-b16b-e7f05a616aed/cert-manager-controller/0.log" Dec 05 09:53:29 crc kubenswrapper[4795]: I1205 09:53:29.260783 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-bwd4b_f7d36052-3c5d-4bc4-b8c9-82efe88058d7/cert-manager-cainjector/0.log" Dec 05 09:53:29 crc kubenswrapper[4795]: I1205 09:53:29.413808 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-97fmt_b29439d4-ea88-4ced-ae64-e4926a6d9826/cert-manager-webhook/0.log" Dec 05 09:53:32 crc kubenswrapper[4795]: I1205 09:53:32.748779 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:53:32 crc kubenswrapper[4795]: E1205 09:53:32.749973 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:53:45 crc kubenswrapper[4795]: I1205 09:53:45.747947 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:53:45 crc kubenswrapper[4795]: E1205 09:53:45.749213 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:53:46 crc kubenswrapper[4795]: I1205 09:53:46.767941 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-c5s7r_e7e2a5ca-6fc7-44be-8d9b-afb4044e5be6/nmstate-console-plugin/0.log" Dec 05 09:53:46 crc kubenswrapper[4795]: I1205 09:53:46.822991 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fhm2n_8513d59a-88fb-4414-bfa8-e5fcfc599cca/nmstate-handler/0.log" Dec 05 09:53:47 crc kubenswrapper[4795]: I1205 09:53:47.083634 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4zm26_347909b2-eaf4-4b55-becf-cde638716053/nmstate-metrics/0.log" Dec 05 09:53:47 crc kubenswrapper[4795]: I1205 09:53:47.124693 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4zm26_347909b2-eaf4-4b55-becf-cde638716053/kube-rbac-proxy/0.log" Dec 05 09:53:47 crc kubenswrapper[4795]: I1205 09:53:47.233125 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-znzrz_d640de60-a6a0-4c76-8fa4-4370edd7363b/nmstate-operator/0.log" Dec 05 09:53:47 crc kubenswrapper[4795]: I1205 09:53:47.338259 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-6bqfk_8b96bb54-83d3-4c36-a347-bad33a85e746/nmstate-webhook/0.log" Dec 05 09:53:58 crc kubenswrapper[4795]: I1205 09:53:58.757468 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:53:58 crc kubenswrapper[4795]: E1205 
09:53:58.758404 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:54:05 crc kubenswrapper[4795]: I1205 09:54:05.690922 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-rzhjw_a92afa97-4689-4d07-aae0-1294479b1198/kube-rbac-proxy/0.log" Dec 05 09:54:05 crc kubenswrapper[4795]: I1205 09:54:05.810543 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-rzhjw_a92afa97-4689-4d07-aae0-1294479b1198/controller/0.log" Dec 05 09:54:05 crc kubenswrapper[4795]: I1205 09:54:05.997519 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-frr-files/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.244986 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-reloader/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.297960 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-metrics/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.298785 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-frr-files/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.365082 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-reloader/0.log" Dec 05 09:54:06 crc 
kubenswrapper[4795]: I1205 09:54:06.597170 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-reloader/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.649421 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-frr-files/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.664343 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-metrics/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.715815 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-metrics/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.982915 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-metrics/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.987662 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-reloader/0.log" Dec 05 09:54:06 crc kubenswrapper[4795]: I1205 09:54:06.990709 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/cp-frr-files/0.log" Dec 05 09:54:07 crc kubenswrapper[4795]: I1205 09:54:07.029801 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/controller/0.log" Dec 05 09:54:07 crc kubenswrapper[4795]: I1205 09:54:07.235268 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/kube-rbac-proxy/0.log" Dec 05 09:54:07 crc kubenswrapper[4795]: I1205 09:54:07.339906 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/frr-metrics/0.log" Dec 05 09:54:07 crc kubenswrapper[4795]: I1205 09:54:07.379483 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/kube-rbac-proxy-frr/0.log" Dec 05 09:54:07 crc kubenswrapper[4795]: I1205 09:54:07.619922 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/reloader/0.log" Dec 05 09:54:07 crc kubenswrapper[4795]: I1205 09:54:07.805534 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-q56zl_2297e0a2-10ff-47d9-8acf-c94bf4bddc9f/frr-k8s-webhook-server/0.log" Dec 05 09:54:08 crc kubenswrapper[4795]: I1205 09:54:08.229095 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-f466d6f9b-tnbv6_3a2de58b-f473-4a10-a1c3-1286a5a28aa3/manager/0.log" Dec 05 09:54:08 crc kubenswrapper[4795]: I1205 09:54:08.353524 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6f86ff657c-whbk7_b4285b69-fac1-4122-ae5c-8017dcd83316/webhook-server/0.log" Dec 05 09:54:08 crc kubenswrapper[4795]: I1205 09:54:08.564857 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jsnlr_3b63dece-6484-4464-b6a2-c8dcdbb34eae/kube-rbac-proxy/0.log" Dec 05 09:54:09 crc kubenswrapper[4795]: I1205 09:54:09.175208 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-9m78n_e1652139-9bce-404b-a089-375e6023dc34/frr/0.log" Dec 05 09:54:09 crc kubenswrapper[4795]: I1205 09:54:09.341082 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jsnlr_3b63dece-6484-4464-b6a2-c8dcdbb34eae/speaker/0.log" Dec 05 09:54:13 crc kubenswrapper[4795]: I1205 09:54:13.749015 4795 scope.go:117] 
"RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:54:13 crc kubenswrapper[4795]: E1205 09:54:13.750537 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:54:24 crc kubenswrapper[4795]: I1205 09:54:24.747972 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:54:24 crc kubenswrapper[4795]: E1205 09:54:24.748871 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:54:25 crc kubenswrapper[4795]: I1205 09:54:25.355841 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w_ff41acb4-73bc-4c91-9556-ef40bd698fd1/util/0.log" Dec 05 09:54:25 crc kubenswrapper[4795]: I1205 09:54:25.563799 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w_ff41acb4-73bc-4c91-9556-ef40bd698fd1/pull/0.log" Dec 05 09:54:25 crc kubenswrapper[4795]: I1205 09:54:25.593124 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w_ff41acb4-73bc-4c91-9556-ef40bd698fd1/util/0.log" Dec 05 09:54:25 crc kubenswrapper[4795]: I1205 09:54:25.624881 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w_ff41acb4-73bc-4c91-9556-ef40bd698fd1/pull/0.log" Dec 05 09:54:25 crc kubenswrapper[4795]: I1205 09:54:25.875365 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w_ff41acb4-73bc-4c91-9556-ef40bd698fd1/pull/0.log" Dec 05 09:54:25 crc kubenswrapper[4795]: I1205 09:54:25.907717 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w_ff41acb4-73bc-4c91-9556-ef40bd698fd1/extract/0.log" Dec 05 09:54:25 crc kubenswrapper[4795]: I1205 09:54:25.935535 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fr8k7w_ff41acb4-73bc-4c91-9556-ef40bd698fd1/util/0.log" Dec 05 09:54:26 crc kubenswrapper[4795]: I1205 09:54:26.107570 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9_3cd05f4a-f1f2-4d26-bbd8-1216247ed955/util/0.log" Dec 05 09:54:26 crc kubenswrapper[4795]: I1205 09:54:26.303530 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9_3cd05f4a-f1f2-4d26-bbd8-1216247ed955/util/0.log" Dec 05 09:54:26 crc kubenswrapper[4795]: I1205 09:54:26.334784 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9_3cd05f4a-f1f2-4d26-bbd8-1216247ed955/pull/0.log" Dec 05 
09:54:26 crc kubenswrapper[4795]: I1205 09:54:26.383081 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9_3cd05f4a-f1f2-4d26-bbd8-1216247ed955/pull/0.log" Dec 05 09:54:26 crc kubenswrapper[4795]: I1205 09:54:26.664230 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9_3cd05f4a-f1f2-4d26-bbd8-1216247ed955/util/0.log" Dec 05 09:54:26 crc kubenswrapper[4795]: I1205 09:54:26.709267 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9_3cd05f4a-f1f2-4d26-bbd8-1216247ed955/extract/0.log" Dec 05 09:54:26 crc kubenswrapper[4795]: I1205 09:54:26.723241 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83jczh9_3cd05f4a-f1f2-4d26-bbd8-1216247ed955/pull/0.log" Dec 05 09:54:26 crc kubenswrapper[4795]: I1205 09:54:26.906122 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-llggw_ac916cee-7b91-43d8-9383-029f9b983c8d/extract-utilities/0.log" Dec 05 09:54:27 crc kubenswrapper[4795]: I1205 09:54:27.137782 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-llggw_ac916cee-7b91-43d8-9383-029f9b983c8d/extract-utilities/0.log" Dec 05 09:54:27 crc kubenswrapper[4795]: I1205 09:54:27.218997 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-llggw_ac916cee-7b91-43d8-9383-029f9b983c8d/extract-content/0.log" Dec 05 09:54:27 crc kubenswrapper[4795]: I1205 09:54:27.226887 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-llggw_ac916cee-7b91-43d8-9383-029f9b983c8d/extract-content/0.log" Dec 05 
09:54:27 crc kubenswrapper[4795]: I1205 09:54:27.430005 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-llggw_ac916cee-7b91-43d8-9383-029f9b983c8d/extract-utilities/0.log" Dec 05 09:54:27 crc kubenswrapper[4795]: I1205 09:54:27.558051 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-llggw_ac916cee-7b91-43d8-9383-029f9b983c8d/extract-content/0.log" Dec 05 09:54:27 crc kubenswrapper[4795]: I1205 09:54:27.880146 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qhq5q_a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e/extract-utilities/0.log" Dec 05 09:54:28 crc kubenswrapper[4795]: I1205 09:54:28.067548 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qhq5q_a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e/extract-utilities/0.log" Dec 05 09:54:28 crc kubenswrapper[4795]: I1205 09:54:28.089933 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qhq5q_a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e/extract-content/0.log" Dec 05 09:54:28 crc kubenswrapper[4795]: I1205 09:54:28.221132 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qhq5q_a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e/extract-content/0.log" Dec 05 09:54:28 crc kubenswrapper[4795]: I1205 09:54:28.275699 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-llggw_ac916cee-7b91-43d8-9383-029f9b983c8d/registry-server/0.log" Dec 05 09:54:28 crc kubenswrapper[4795]: I1205 09:54:28.468012 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qhq5q_a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e/extract-content/0.log" Dec 05 09:54:28 crc kubenswrapper[4795]: I1205 09:54:28.468481 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-qhq5q_a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e/extract-utilities/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.023777 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nz6vn_5e854247-6adf-4f84-96b8-083f8772d8eb/marketplace-operator/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.082014 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qhq5q_a5f6872a-85c6-4e28-a9ad-dcea0fde3f1e/registry-server/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.111683 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zrx88_56e94ad9-4c99-4fa8-bb1e-540fadd9410c/extract-utilities/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.397343 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zrx88_56e94ad9-4c99-4fa8-bb1e-540fadd9410c/extract-utilities/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.400286 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zrx88_56e94ad9-4c99-4fa8-bb1e-540fadd9410c/extract-content/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.450285 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zrx88_56e94ad9-4c99-4fa8-bb1e-540fadd9410c/extract-content/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.721762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zrx88_56e94ad9-4c99-4fa8-bb1e-540fadd9410c/extract-utilities/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.793087 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-cccsq_c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e/extract-utilities/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.915064 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zrx88_56e94ad9-4c99-4fa8-bb1e-540fadd9410c/extract-content/0.log" Dec 05 09:54:29 crc kubenswrapper[4795]: I1205 09:54:29.986479 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zrx88_56e94ad9-4c99-4fa8-bb1e-540fadd9410c/registry-server/0.log" Dec 05 09:54:30 crc kubenswrapper[4795]: I1205 09:54:30.110321 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cccsq_c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e/extract-content/0.log" Dec 05 09:54:30 crc kubenswrapper[4795]: I1205 09:54:30.132450 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cccsq_c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e/extract-utilities/0.log" Dec 05 09:54:30 crc kubenswrapper[4795]: I1205 09:54:30.174917 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cccsq_c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e/extract-content/0.log" Dec 05 09:54:30 crc kubenswrapper[4795]: I1205 09:54:30.402011 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cccsq_c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e/extract-content/0.log" Dec 05 09:54:30 crc kubenswrapper[4795]: I1205 09:54:30.528888 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cccsq_c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e/extract-utilities/0.log" Dec 05 09:54:31 crc kubenswrapper[4795]: I1205 09:54:31.161906 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-cccsq_c66e50bc-47ed-41a6-9b2c-ee811bbe0e6e/registry-server/0.log" Dec 05 
09:54:38 crc kubenswrapper[4795]: I1205 09:54:38.754794 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:54:38 crc kubenswrapper[4795]: E1205 09:54:38.755574 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:54:50 crc kubenswrapper[4795]: I1205 09:54:50.747531 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:54:50 crc kubenswrapper[4795]: E1205 09:54:50.748185 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:55:01 crc kubenswrapper[4795]: I1205 09:55:01.748732 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:55:01 crc kubenswrapper[4795]: E1205 09:55:01.752410 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:55:13 crc kubenswrapper[4795]: I1205 09:55:13.747894 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:55:13 crc kubenswrapper[4795]: E1205 09:55:13.749126 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:55:17 crc kubenswrapper[4795]: E1205 09:55:17.155505 4795 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.222:53464->38.102.83.222:40791: write tcp 38.102.83.222:53464->38.102.83.222:40791: write: broken pipe Dec 05 09:55:25 crc kubenswrapper[4795]: I1205 09:55:25.749104 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:55:25 crc kubenswrapper[4795]: E1205 09:55:25.749950 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:55:39 crc kubenswrapper[4795]: I1205 09:55:39.747858 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:55:39 crc kubenswrapper[4795]: E1205 09:55:39.748522 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:55:53 crc kubenswrapper[4795]: I1205 09:55:53.748234 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:55:53 crc kubenswrapper[4795]: E1205 09:55:53.749193 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:56:06 crc kubenswrapper[4795]: I1205 09:56:06.748342 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:56:06 crc kubenswrapper[4795]: E1205 09:56:06.748990 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:56:20 crc kubenswrapper[4795]: I1205 09:56:20.747628 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:56:20 crc kubenswrapper[4795]: E1205 09:56:20.748377 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:56:34 crc kubenswrapper[4795]: I1205 09:56:34.748250 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:56:34 crc kubenswrapper[4795]: E1205 09:56:34.749271 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:56:44 crc kubenswrapper[4795]: I1205 09:56:44.694346 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="09a55d95-050f-4262-9bb4-7dc81ae6ea34" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:56:44 crc kubenswrapper[4795]: I1205 09:56:44.695046 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="09a55d95-050f-4262-9bb4-7dc81ae6ea34" containerName="galera" probeResult="failure" output="command timed out" Dec 05 09:56:47 crc kubenswrapper[4795]: I1205 09:56:47.747842 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:56:47 crc kubenswrapper[4795]: E1205 09:56:47.751115 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:56:58 crc kubenswrapper[4795]: I1205 09:56:58.757069 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:56:58 crc kubenswrapper[4795]: E1205 09:56:58.757838 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:57:10 crc kubenswrapper[4795]: I1205 09:57:10.748174 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:57:10 crc kubenswrapper[4795]: E1205 09:57:10.749041 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t68zt_openshift-machine-config-operator(23494e8d-0824-46a2-9b0c-c447f1d5e5d0)\"" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" Dec 05 09:57:22 crc kubenswrapper[4795]: I1205 09:57:22.753775 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 09:57:24 crc kubenswrapper[4795]: I1205 09:57:24.684732 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" 
event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"55d879d76702599e75ae390a46a00e3422841d9f2b7bb9d4418d2c64541233fc"} Dec 05 09:57:25 crc kubenswrapper[4795]: I1205 09:57:25.696317 4795 generic.go:334] "Generic (PLEG): container finished" podID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerID="b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29" exitCode=0 Dec 05 09:57:25 crc kubenswrapper[4795]: I1205 09:57:25.696394 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" event={"ID":"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397","Type":"ContainerDied","Data":"b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29"} Dec 05 09:57:25 crc kubenswrapper[4795]: I1205 09:57:25.697391 4795 scope.go:117] "RemoveContainer" containerID="b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29" Dec 05 09:57:26 crc kubenswrapper[4795]: I1205 09:57:26.674808 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r7g2q_must-gather-t9zkl_76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397/gather/0.log" Dec 05 09:57:36 crc kubenswrapper[4795]: I1205 09:57:36.811804 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-r7g2q/must-gather-t9zkl"] Dec 05 09:57:36 crc kubenswrapper[4795]: I1205 09:57:36.812758 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" podUID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerName="copy" containerID="cri-o://383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65" gracePeriod=2 Dec 05 09:57:36 crc kubenswrapper[4795]: I1205 09:57:36.831415 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-r7g2q/must-gather-t9zkl"] Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.710213 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-r7g2q_must-gather-t9zkl_76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397/copy/0.log" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.711420 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.768634 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-must-gather-output\") pod \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\" (UID: \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\") " Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.768756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46klk\" (UniqueName: \"kubernetes.io/projected/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-kube-api-access-46klk\") pod \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\" (UID: \"76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397\") " Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.776750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-kube-api-access-46klk" (OuterVolumeSpecName: "kube-api-access-46klk") pod "76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" (UID: "76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397"). InnerVolumeSpecName "kube-api-access-46klk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.840670 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-r7g2q_must-gather-t9zkl_76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397/copy/0.log" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.847955 4795 generic.go:334] "Generic (PLEG): container finished" podID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerID="383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65" exitCode=143 Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.848070 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r7g2q/must-gather-t9zkl" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.848142 4795 scope.go:117] "RemoveContainer" containerID="383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.874394 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46klk\" (UniqueName: \"kubernetes.io/projected/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-kube-api-access-46klk\") on node \"crc\" DevicePath \"\"" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.883715 4795 scope.go:117] "RemoveContainer" containerID="b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.956498 4795 scope.go:117] "RemoveContainer" containerID="383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65" Dec 05 09:57:37 crc kubenswrapper[4795]: E1205 09:57:37.957418 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65\": container with ID starting with 383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65 not found: ID does not exist" 
containerID="383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.957450 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65"} err="failed to get container status \"383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65\": rpc error: code = NotFound desc = could not find container \"383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65\": container with ID starting with 383e171b55bc96895bee96c6c74b66e8a94ffff096bbc968f2825aa44eaf6d65 not found: ID does not exist" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.957474 4795 scope.go:117] "RemoveContainer" containerID="b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29" Dec 05 09:57:37 crc kubenswrapper[4795]: E1205 09:57:37.958636 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29\": container with ID starting with b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29 not found: ID does not exist" containerID="b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.958729 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29"} err="failed to get container status \"b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29\": rpc error: code = NotFound desc = could not find container \"b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29\": container with ID starting with b952d983fb6fea80ae7ba9042e4ff0f53c622748dc6b809774deab172a343c29 not found: ID does not exist" Dec 05 09:57:37 crc kubenswrapper[4795]: I1205 09:57:37.996217 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" (UID: "76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:57:38 crc kubenswrapper[4795]: I1205 09:57:38.078790 4795 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 05 09:57:38 crc kubenswrapper[4795]: I1205 09:57:38.758030 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" path="/var/lib/kubelet/pods/76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397/volumes" Dec 05 09:57:55 crc kubenswrapper[4795]: I1205 09:57:55.425268 4795 scope.go:117] "RemoveContainer" containerID="9d3a8b0923057ffcfa34d2c704d877a8653d7953f0d6df091b2506d87cfad516" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.589048 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v4q6q"] Dec 05 09:58:05 crc kubenswrapper[4795]: E1205 09:58:05.589988 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerName="extract-utilities" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.590002 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerName="extract-utilities" Dec 05 09:58:05 crc kubenswrapper[4795]: E1205 09:58:05.590029 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerName="gather" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.590035 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerName="gather" Dec 05 09:58:05 crc kubenswrapper[4795]: E1205 09:58:05.590053 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerName="registry-server" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.590060 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerName="registry-server" Dec 05 09:58:05 crc kubenswrapper[4795]: E1205 09:58:05.590077 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerName="copy" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.590085 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerName="copy" Dec 05 09:58:05 crc kubenswrapper[4795]: E1205 09:58:05.590113 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerName="extract-content" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.590121 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerName="extract-content" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.590302 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerName="gather" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.590319 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="76eb2c1b-bd2e-4d60-b0a3-9d2ff41f9397" containerName="copy" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.590332 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54c4a05-5142-4706-91ba-dbf0d7a50fc1" containerName="registry-server" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.591847 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.621754 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4q6q"] Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.695290 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7knqz\" (UniqueName: \"kubernetes.io/projected/7a422b1f-30ad-44ea-8599-fe0ab523036f-kube-api-access-7knqz\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.695423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-utilities\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.695890 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-catalog-content\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.797713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-catalog-content\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.797819 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7knqz\" (UniqueName: \"kubernetes.io/projected/7a422b1f-30ad-44ea-8599-fe0ab523036f-kube-api-access-7knqz\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.797847 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-utilities\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.798352 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-utilities\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.798481 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-catalog-content\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.830631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7knqz\" (UniqueName: \"kubernetes.io/projected/7a422b1f-30ad-44ea-8599-fe0ab523036f-kube-api-access-7knqz\") pod \"certified-operators-v4q6q\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:05 crc kubenswrapper[4795]: I1205 09:58:05.928471 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:06 crc kubenswrapper[4795]: I1205 09:58:06.579005 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4q6q"] Dec 05 09:58:07 crc kubenswrapper[4795]: I1205 09:58:07.158009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4q6q" event={"ID":"7a422b1f-30ad-44ea-8599-fe0ab523036f","Type":"ContainerStarted","Data":"f2dee244cae4d631d1a8f560f1eac07a5e1886ac1a0745a6b3271f5fa79ba82b"} Dec 05 09:58:08 crc kubenswrapper[4795]: I1205 09:58:08.170357 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerID="0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a" exitCode=0 Dec 05 09:58:08 crc kubenswrapper[4795]: I1205 09:58:08.170473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4q6q" event={"ID":"7a422b1f-30ad-44ea-8599-fe0ab523036f","Type":"ContainerDied","Data":"0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a"} Dec 05 09:58:08 crc kubenswrapper[4795]: I1205 09:58:08.173283 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 05 09:58:10 crc kubenswrapper[4795]: I1205 09:58:10.193853 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerID="2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155" exitCode=0 Dec 05 09:58:10 crc kubenswrapper[4795]: I1205 09:58:10.193908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4q6q" event={"ID":"7a422b1f-30ad-44ea-8599-fe0ab523036f","Type":"ContainerDied","Data":"2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155"} Dec 05 09:58:11 crc kubenswrapper[4795]: I1205 09:58:11.208265 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-v4q6q" event={"ID":"7a422b1f-30ad-44ea-8599-fe0ab523036f","Type":"ContainerStarted","Data":"23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3"} Dec 05 09:58:11 crc kubenswrapper[4795]: I1205 09:58:11.231483 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v4q6q" podStartSLOduration=3.8399762969999998 podStartE2EDuration="6.231460717s" podCreationTimestamp="2025-12-05 09:58:05 +0000 UTC" firstStartedPulling="2025-12-05 09:58:08.172954858 +0000 UTC m=+5639.745558597" lastFinishedPulling="2025-12-05 09:58:10.564439278 +0000 UTC m=+5642.137043017" observedRunningTime="2025-12-05 09:58:11.227438468 +0000 UTC m=+5642.800042207" watchObservedRunningTime="2025-12-05 09:58:11.231460717 +0000 UTC m=+5642.804064466" Dec 05 09:58:15 crc kubenswrapper[4795]: I1205 09:58:15.929217 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:15 crc kubenswrapper[4795]: I1205 09:58:15.929833 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:15 crc kubenswrapper[4795]: I1205 09:58:15.978296 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:16 crc kubenswrapper[4795]: I1205 09:58:16.313117 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:16 crc kubenswrapper[4795]: I1205 09:58:16.368996 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4q6q"] Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.286417 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v4q6q" 
podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerName="registry-server" containerID="cri-o://23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3" gracePeriod=2 Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.733080 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.781800 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-catalog-content\") pod \"7a422b1f-30ad-44ea-8599-fe0ab523036f\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.781915 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-utilities\") pod \"7a422b1f-30ad-44ea-8599-fe0ab523036f\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.782054 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7knqz\" (UniqueName: \"kubernetes.io/projected/7a422b1f-30ad-44ea-8599-fe0ab523036f-kube-api-access-7knqz\") pod \"7a422b1f-30ad-44ea-8599-fe0ab523036f\" (UID: \"7a422b1f-30ad-44ea-8599-fe0ab523036f\") " Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.783077 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-utilities" (OuterVolumeSpecName: "utilities") pod "7a422b1f-30ad-44ea-8599-fe0ab523036f" (UID: "7a422b1f-30ad-44ea-8599-fe0ab523036f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.799793 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a422b1f-30ad-44ea-8599-fe0ab523036f-kube-api-access-7knqz" (OuterVolumeSpecName: "kube-api-access-7knqz") pod "7a422b1f-30ad-44ea-8599-fe0ab523036f" (UID: "7a422b1f-30ad-44ea-8599-fe0ab523036f"). InnerVolumeSpecName "kube-api-access-7knqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.836851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a422b1f-30ad-44ea-8599-fe0ab523036f" (UID: "7a422b1f-30ad-44ea-8599-fe0ab523036f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.884018 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7knqz\" (UniqueName: \"kubernetes.io/projected/7a422b1f-30ad-44ea-8599-fe0ab523036f-kube-api-access-7knqz\") on node \"crc\" DevicePath \"\"" Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.884056 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:58:18 crc kubenswrapper[4795]: I1205 09:58:18.884066 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a422b1f-30ad-44ea-8599-fe0ab523036f-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.305234 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a422b1f-30ad-44ea-8599-fe0ab523036f" 
containerID="23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3" exitCode=0 Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.305456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4q6q" event={"ID":"7a422b1f-30ad-44ea-8599-fe0ab523036f","Type":"ContainerDied","Data":"23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3"} Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.305593 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4q6q" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.305600 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4q6q" event={"ID":"7a422b1f-30ad-44ea-8599-fe0ab523036f","Type":"ContainerDied","Data":"f2dee244cae4d631d1a8f560f1eac07a5e1886ac1a0745a6b3271f5fa79ba82b"} Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.305708 4795 scope.go:117] "RemoveContainer" containerID="23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.327418 4795 scope.go:117] "RemoveContainer" containerID="2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.352191 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4q6q"] Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.363434 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v4q6q"] Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.364566 4795 scope.go:117] "RemoveContainer" containerID="0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.413665 4795 scope.go:117] "RemoveContainer" containerID="23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3" Dec 05 
09:58:19 crc kubenswrapper[4795]: E1205 09:58:19.414230 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3\": container with ID starting with 23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3 not found: ID does not exist" containerID="23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.414266 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3"} err="failed to get container status \"23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3\": rpc error: code = NotFound desc = could not find container \"23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3\": container with ID starting with 23526df31564db4d6cd3901d487bae33f8d0f8f9777d072878d416e5e6667ff3 not found: ID does not exist" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.414295 4795 scope.go:117] "RemoveContainer" containerID="2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155" Dec 05 09:58:19 crc kubenswrapper[4795]: E1205 09:58:19.414670 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155\": container with ID starting with 2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155 not found: ID does not exist" containerID="2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.414702 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155"} err="failed to get container status 
\"2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155\": rpc error: code = NotFound desc = could not find container \"2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155\": container with ID starting with 2adb05eb485e6cd5be96f09e2fdbcec2c91903193ca1b41598b692ec2532f155 not found: ID does not exist" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.414717 4795 scope.go:117] "RemoveContainer" containerID="0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a" Dec 05 09:58:19 crc kubenswrapper[4795]: E1205 09:58:19.415027 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a\": container with ID starting with 0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a not found: ID does not exist" containerID="0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a" Dec 05 09:58:19 crc kubenswrapper[4795]: I1205 09:58:19.415048 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a"} err="failed to get container status \"0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a\": rpc error: code = NotFound desc = could not find container \"0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a\": container with ID starting with 0afcb50f3dc00ecd9bfd0bc1bab1cdccb780a285b7c5bd541c47c9f700ae7a1a not found: ID does not exist" Dec 05 09:58:20 crc kubenswrapper[4795]: I1205 09:58:20.761204 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" path="/var/lib/kubelet/pods/7a422b1f-30ad-44ea-8599-fe0ab523036f/volumes" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.244915 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fdwds"] Dec 05 09:59:36 crc 
kubenswrapper[4795]: E1205 09:59:36.246051 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerName="extract-utilities" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.246071 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerName="extract-utilities" Dec 05 09:59:36 crc kubenswrapper[4795]: E1205 09:59:36.246089 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerName="registry-server" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.247116 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerName="registry-server" Dec 05 09:59:36 crc kubenswrapper[4795]: E1205 09:59:36.247158 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerName="extract-content" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.247170 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerName="extract-content" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.247526 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a422b1f-30ad-44ea-8599-fe0ab523036f" containerName="registry-server" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.249579 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.451127 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdwds"] Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.548867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-utilities\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.548938 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrxj\" (UniqueName: \"kubernetes.io/projected/31850ce3-2f75-45b8-a713-d09863eff785-kube-api-access-vbrxj\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.549119 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-catalog-content\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.651074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-catalog-content\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.651178 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-utilities\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.651207 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrxj\" (UniqueName: \"kubernetes.io/projected/31850ce3-2f75-45b8-a713-d09863eff785-kube-api-access-vbrxj\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.651883 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-catalog-content\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.651963 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-utilities\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.677651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrxj\" (UniqueName: \"kubernetes.io/projected/31850ce3-2f75-45b8-a713-d09863eff785-kube-api-access-vbrxj\") pod \"redhat-operators-fdwds\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:36 crc kubenswrapper[4795]: I1205 09:59:36.781372 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:37 crc kubenswrapper[4795]: I1205 09:59:37.196860 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdwds"] Dec 05 09:59:38 crc kubenswrapper[4795]: I1205 09:59:38.163779 4795 generic.go:334] "Generic (PLEG): container finished" podID="31850ce3-2f75-45b8-a713-d09863eff785" containerID="bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280" exitCode=0 Dec 05 09:59:38 crc kubenswrapper[4795]: I1205 09:59:38.164006 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdwds" event={"ID":"31850ce3-2f75-45b8-a713-d09863eff785","Type":"ContainerDied","Data":"bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280"} Dec 05 09:59:38 crc kubenswrapper[4795]: I1205 09:59:38.164059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdwds" event={"ID":"31850ce3-2f75-45b8-a713-d09863eff785","Type":"ContainerStarted","Data":"1524f6f191828ff37f49ff84610cd4af4bca1e74dc8eb80450f8a5373c87aa00"} Dec 05 09:59:39 crc kubenswrapper[4795]: I1205 09:59:39.175366 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdwds" event={"ID":"31850ce3-2f75-45b8-a713-d09863eff785","Type":"ContainerStarted","Data":"62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0"} Dec 05 09:59:40 crc kubenswrapper[4795]: I1205 09:59:40.827352 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 09:59:40 crc kubenswrapper[4795]: I1205 09:59:40.827804 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 09:59:44 crc kubenswrapper[4795]: I1205 09:59:44.264034 4795 generic.go:334] "Generic (PLEG): container finished" podID="31850ce3-2f75-45b8-a713-d09863eff785" containerID="62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0" exitCode=0 Dec 05 09:59:44 crc kubenswrapper[4795]: I1205 09:59:44.264573 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdwds" event={"ID":"31850ce3-2f75-45b8-a713-d09863eff785","Type":"ContainerDied","Data":"62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0"} Dec 05 09:59:46 crc kubenswrapper[4795]: I1205 09:59:46.290666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdwds" event={"ID":"31850ce3-2f75-45b8-a713-d09863eff785","Type":"ContainerStarted","Data":"34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b"} Dec 05 09:59:46 crc kubenswrapper[4795]: I1205 09:59:46.319864 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fdwds" podStartSLOduration=2.8889557679999998 podStartE2EDuration="10.319843668s" podCreationTimestamp="2025-12-05 09:59:36 +0000 UTC" firstStartedPulling="2025-12-05 09:59:38.166372748 +0000 UTC m=+5729.738976487" lastFinishedPulling="2025-12-05 09:59:45.597260658 +0000 UTC m=+5737.169864387" observedRunningTime="2025-12-05 09:59:46.311533912 +0000 UTC m=+5737.884137651" watchObservedRunningTime="2025-12-05 09:59:46.319843668 +0000 UTC m=+5737.892447407" Dec 05 09:59:46 crc kubenswrapper[4795]: I1205 09:59:46.782547 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fdwds" 
Dec 05 09:59:46 crc kubenswrapper[4795]: I1205 09:59:46.782596 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:47 crc kubenswrapper[4795]: I1205 09:59:47.842393 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fdwds" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="registry-server" probeResult="failure" output=< Dec 05 09:59:47 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Dec 05 09:59:47 crc kubenswrapper[4795]: > Dec 05 09:59:56 crc kubenswrapper[4795]: I1205 09:59:56.843467 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:56 crc kubenswrapper[4795]: I1205 09:59:56.901307 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:57 crc kubenswrapper[4795]: I1205 09:59:57.097342 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fdwds"] Dec 05 09:59:58 crc kubenswrapper[4795]: I1205 09:59:58.412562 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fdwds" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="registry-server" containerID="cri-o://34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b" gracePeriod=2 Dec 05 09:59:58 crc kubenswrapper[4795]: I1205 09:59:58.956827 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.071247 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-catalog-content\") pod \"31850ce3-2f75-45b8-a713-d09863eff785\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.071468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbrxj\" (UniqueName: \"kubernetes.io/projected/31850ce3-2f75-45b8-a713-d09863eff785-kube-api-access-vbrxj\") pod \"31850ce3-2f75-45b8-a713-d09863eff785\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.071641 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-utilities\") pod \"31850ce3-2f75-45b8-a713-d09863eff785\" (UID: \"31850ce3-2f75-45b8-a713-d09863eff785\") " Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.072194 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-utilities" (OuterVolumeSpecName: "utilities") pod "31850ce3-2f75-45b8-a713-d09863eff785" (UID: "31850ce3-2f75-45b8-a713-d09863eff785"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.080038 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31850ce3-2f75-45b8-a713-d09863eff785-kube-api-access-vbrxj" (OuterVolumeSpecName: "kube-api-access-vbrxj") pod "31850ce3-2f75-45b8-a713-d09863eff785" (UID: "31850ce3-2f75-45b8-a713-d09863eff785"). InnerVolumeSpecName "kube-api-access-vbrxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.174351 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-utilities\") on node \"crc\" DevicePath \"\"" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.174389 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbrxj\" (UniqueName: \"kubernetes.io/projected/31850ce3-2f75-45b8-a713-d09863eff785-kube-api-access-vbrxj\") on node \"crc\" DevicePath \"\"" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.196921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31850ce3-2f75-45b8-a713-d09863eff785" (UID: "31850ce3-2f75-45b8-a713-d09863eff785"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.277127 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31850ce3-2f75-45b8-a713-d09863eff785-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.424848 4795 generic.go:334] "Generic (PLEG): container finished" podID="31850ce3-2f75-45b8-a713-d09863eff785" containerID="34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b" exitCode=0 Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.424904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdwds" event={"ID":"31850ce3-2f75-45b8-a713-d09863eff785","Type":"ContainerDied","Data":"34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b"} Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.424943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-fdwds" event={"ID":"31850ce3-2f75-45b8-a713-d09863eff785","Type":"ContainerDied","Data":"1524f6f191828ff37f49ff84610cd4af4bca1e74dc8eb80450f8a5373c87aa00"} Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.424966 4795 scope.go:117] "RemoveContainer" containerID="34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.425375 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdwds" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.457626 4795 scope.go:117] "RemoveContainer" containerID="62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.464367 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fdwds"] Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.475388 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fdwds"] Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.494163 4795 scope.go:117] "RemoveContainer" containerID="bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.544467 4795 scope.go:117] "RemoveContainer" containerID="34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b" Dec 05 09:59:59 crc kubenswrapper[4795]: E1205 09:59:59.545001 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b\": container with ID starting with 34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b not found: ID does not exist" containerID="34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.545237 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b"} err="failed to get container status \"34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b\": rpc error: code = NotFound desc = could not find container \"34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b\": container with ID starting with 34d4c26c65ea6cef2a6c475e0f650d03021b8e14b1b5418846085625da795b2b not found: ID does not exist" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.545263 4795 scope.go:117] "RemoveContainer" containerID="62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0" Dec 05 09:59:59 crc kubenswrapper[4795]: E1205 09:59:59.545865 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0\": container with ID starting with 62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0 not found: ID does not exist" containerID="62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.545933 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0"} err="failed to get container status \"62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0\": rpc error: code = NotFound desc = could not find container \"62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0\": container with ID starting with 62e3b5a5fe6045561c95d12fb4d8e76e4afb57e88f68c55b4b4414e23c4701f0 not found: ID does not exist" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.545975 4795 scope.go:117] "RemoveContainer" containerID="bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280" Dec 05 09:59:59 crc kubenswrapper[4795]: E1205 
09:59:59.546481 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280\": container with ID starting with bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280 not found: ID does not exist" containerID="bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280" Dec 05 09:59:59 crc kubenswrapper[4795]: I1205 09:59:59.546513 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280"} err="failed to get container status \"bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280\": rpc error: code = NotFound desc = could not find container \"bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280\": container with ID starting with bc723901af5c094c23789a7bffc4dd1b4441f679cbcc93f35d39f17eefd6b280 not found: ID does not exist" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.158481 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t"] Dec 05 10:00:00 crc kubenswrapper[4795]: E1205 10:00:00.159097 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="extract-content" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.159124 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="extract-content" Dec 05 10:00:00 crc kubenswrapper[4795]: E1205 10:00:00.159138 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="registry-server" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.159146 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="registry-server" Dec 
05 10:00:00 crc kubenswrapper[4795]: E1205 10:00:00.159156 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="extract-utilities" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.159162 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="extract-utilities" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.159449 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="31850ce3-2f75-45b8-a713-d09863eff785" containerName="registry-server" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.160401 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.171246 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.171332 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.174408 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t"] Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.308345 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351c59d4-d3d8-4922-9344-7e8868ad8549-secret-volume\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.308525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351c59d4-d3d8-4922-9344-7e8868ad8549-config-volume\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.308571 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9st2\" (UniqueName: \"kubernetes.io/projected/351c59d4-d3d8-4922-9344-7e8868ad8549-kube-api-access-l9st2\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.411002 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351c59d4-d3d8-4922-9344-7e8868ad8549-secret-volume\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.411099 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351c59d4-d3d8-4922-9344-7e8868ad8549-config-volume\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.411121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9st2\" (UniqueName: \"kubernetes.io/projected/351c59d4-d3d8-4922-9344-7e8868ad8549-kube-api-access-l9st2\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.412388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351c59d4-d3d8-4922-9344-7e8868ad8549-config-volume\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.419375 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351c59d4-d3d8-4922-9344-7e8868ad8549-secret-volume\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.428418 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9st2\" (UniqueName: \"kubernetes.io/projected/351c59d4-d3d8-4922-9344-7e8868ad8549-kube-api-access-l9st2\") pod \"collect-profiles-29415480-7kw5t\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.504391 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:00 crc kubenswrapper[4795]: I1205 10:00:00.763034 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31850ce3-2f75-45b8-a713-d09863eff785" path="/var/lib/kubelet/pods/31850ce3-2f75-45b8-a713-d09863eff785/volumes" Dec 05 10:00:01 crc kubenswrapper[4795]: W1205 10:00:01.028558 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351c59d4_d3d8_4922_9344_7e8868ad8549.slice/crio-df445473ca3743434c641cdc3ec744cc28177b9371d1ad6c5a994309e0e7a21f WatchSource:0}: Error finding container df445473ca3743434c641cdc3ec744cc28177b9371d1ad6c5a994309e0e7a21f: Status 404 returned error can't find the container with id df445473ca3743434c641cdc3ec744cc28177b9371d1ad6c5a994309e0e7a21f Dec 05 10:00:01 crc kubenswrapper[4795]: I1205 10:00:01.031571 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t"] Dec 05 10:00:01 crc kubenswrapper[4795]: I1205 10:00:01.453972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" event={"ID":"351c59d4-d3d8-4922-9344-7e8868ad8549","Type":"ContainerStarted","Data":"6f3dd796e98905f43d0cb8f4db9af24d710e191ee3caa47e40ebb966b1a90a8d"} Dec 05 10:00:01 crc kubenswrapper[4795]: I1205 10:00:01.454070 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" event={"ID":"351c59d4-d3d8-4922-9344-7e8868ad8549","Type":"ContainerStarted","Data":"df445473ca3743434c641cdc3ec744cc28177b9371d1ad6c5a994309e0e7a21f"} Dec 05 10:00:01 crc kubenswrapper[4795]: I1205 10:00:01.479307 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" 
podStartSLOduration=1.479286064 podStartE2EDuration="1.479286064s" podCreationTimestamp="2025-12-05 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 10:00:01.475930423 +0000 UTC m=+5753.048534162" watchObservedRunningTime="2025-12-05 10:00:01.479286064 +0000 UTC m=+5753.051889803" Dec 05 10:00:02 crc kubenswrapper[4795]: I1205 10:00:02.466527 4795 generic.go:334] "Generic (PLEG): container finished" podID="351c59d4-d3d8-4922-9344-7e8868ad8549" containerID="6f3dd796e98905f43d0cb8f4db9af24d710e191ee3caa47e40ebb966b1a90a8d" exitCode=0 Dec 05 10:00:02 crc kubenswrapper[4795]: I1205 10:00:02.466797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" event={"ID":"351c59d4-d3d8-4922-9344-7e8868ad8549","Type":"ContainerDied","Data":"6f3dd796e98905f43d0cb8f4db9af24d710e191ee3caa47e40ebb966b1a90a8d"} Dec 05 10:00:03 crc kubenswrapper[4795]: I1205 10:00:03.897422 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:03 crc kubenswrapper[4795]: I1205 10:00:03.988903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351c59d4-d3d8-4922-9344-7e8868ad8549-config-volume\") pod \"351c59d4-d3d8-4922-9344-7e8868ad8549\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " Dec 05 10:00:03 crc kubenswrapper[4795]: I1205 10:00:03.989079 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351c59d4-d3d8-4922-9344-7e8868ad8549-secret-volume\") pod \"351c59d4-d3d8-4922-9344-7e8868ad8549\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " Dec 05 10:00:03 crc kubenswrapper[4795]: I1205 10:00:03.989200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9st2\" (UniqueName: \"kubernetes.io/projected/351c59d4-d3d8-4922-9344-7e8868ad8549-kube-api-access-l9st2\") pod \"351c59d4-d3d8-4922-9344-7e8868ad8549\" (UID: \"351c59d4-d3d8-4922-9344-7e8868ad8549\") " Dec 05 10:00:03 crc kubenswrapper[4795]: I1205 10:00:03.997693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/351c59d4-d3d8-4922-9344-7e8868ad8549-config-volume" (OuterVolumeSpecName: "config-volume") pod "351c59d4-d3d8-4922-9344-7e8868ad8549" (UID: "351c59d4-d3d8-4922-9344-7e8868ad8549"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.018956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/351c59d4-d3d8-4922-9344-7e8868ad8549-kube-api-access-l9st2" (OuterVolumeSpecName: "kube-api-access-l9st2") pod "351c59d4-d3d8-4922-9344-7e8868ad8549" (UID: "351c59d4-d3d8-4922-9344-7e8868ad8549"). 
InnerVolumeSpecName "kube-api-access-l9st2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.026869 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/351c59d4-d3d8-4922-9344-7e8868ad8549-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "351c59d4-d3d8-4922-9344-7e8868ad8549" (UID: "351c59d4-d3d8-4922-9344-7e8868ad8549"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.092109 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/351c59d4-d3d8-4922-9344-7e8868ad8549-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.092572 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9st2\" (UniqueName: \"kubernetes.io/projected/351c59d4-d3d8-4922-9344-7e8868ad8549-kube-api-access-l9st2\") on node \"crc\" DevicePath \"\"" Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.092684 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/351c59d4-d3d8-4922-9344-7e8868ad8549-config-volume\") on node \"crc\" DevicePath \"\"" Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.488471 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" event={"ID":"351c59d4-d3d8-4922-9344-7e8868ad8549","Type":"ContainerDied","Data":"df445473ca3743434c641cdc3ec744cc28177b9371d1ad6c5a994309e0e7a21f"} Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.488529 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df445473ca3743434c641cdc3ec744cc28177b9371d1ad6c5a994309e0e7a21f" Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.488603 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29415480-7kw5t" Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.601538 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf"] Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.613283 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29415435-jztdf"] Dec 05 10:00:04 crc kubenswrapper[4795]: I1205 10:00:04.763388 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b675ff9-bacc-46ea-a03d-5872e4ba173c" path="/var/lib/kubelet/pods/9b675ff9-bacc-46ea-a03d-5872e4ba173c/volumes" Dec 05 10:00:10 crc kubenswrapper[4795]: I1205 10:00:10.826800 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 10:00:10 crc kubenswrapper[4795]: I1205 10:00:10.827228 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 10:00:40 crc kubenswrapper[4795]: I1205 10:00:40.826923 4795 patch_prober.go:28] interesting pod/machine-config-daemon-t68zt container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 05 10:00:40 crc kubenswrapper[4795]: I1205 10:00:40.828634 4795 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 05 10:00:40 crc kubenswrapper[4795]: I1205 10:00:40.831032 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" Dec 05 10:00:40 crc kubenswrapper[4795]: I1205 10:00:40.832092 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55d879d76702599e75ae390a46a00e3422841d9f2b7bb9d4418d2c64541233fc"} pod="openshift-machine-config-operator/machine-config-daemon-t68zt" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 05 10:00:40 crc kubenswrapper[4795]: I1205 10:00:40.832238 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" podUID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerName="machine-config-daemon" containerID="cri-o://55d879d76702599e75ae390a46a00e3422841d9f2b7bb9d4418d2c64541233fc" gracePeriod=600 Dec 05 10:00:41 crc kubenswrapper[4795]: I1205 10:00:41.847084 4795 generic.go:334] "Generic (PLEG): container finished" podID="23494e8d-0824-46a2-9b0c-c447f1d5e5d0" containerID="55d879d76702599e75ae390a46a00e3422841d9f2b7bb9d4418d2c64541233fc" exitCode=0 Dec 05 10:00:41 crc kubenswrapper[4795]: I1205 10:00:41.847161 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerDied","Data":"55d879d76702599e75ae390a46a00e3422841d9f2b7bb9d4418d2c64541233fc"} Dec 05 10:00:41 crc kubenswrapper[4795]: I1205 10:00:41.848648 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t68zt" event={"ID":"23494e8d-0824-46a2-9b0c-c447f1d5e5d0","Type":"ContainerStarted","Data":"f2eaa1ca2f0028e4364f0513b2fdbe1e43f27b9022df75aeb7af8c3e9bf80aa9"} Dec 05 10:00:41 crc kubenswrapper[4795]: I1205 10:00:41.848678 4795 scope.go:117] "RemoveContainer" containerID="d9a8dede70a0e95ee29b4fb68a1ccc1e431f49c41bb95a6c73ba2a2fbb51eca6" Dec 05 10:00:55 crc kubenswrapper[4795]: I1205 10:00:55.574309 4795 scope.go:117] "RemoveContainer" containerID="fdc0b3b270a490098118e9c58f4b0f531b3a657b60d7b4cd84bf0910a30887ec" Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.161707 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29415481-bqtsw"] Dec 05 10:01:00 crc kubenswrapper[4795]: E1205 10:01:00.162882 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="351c59d4-d3d8-4922-9344-7e8868ad8549" containerName="collect-profiles" Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.162904 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="351c59d4-d3d8-4922-9344-7e8868ad8549" containerName="collect-profiles" Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.163169 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="351c59d4-d3d8-4922-9344-7e8868ad8549" containerName="collect-profiles" Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.164019 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.182353 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415481-bqtsw"]
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.280295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-config-data\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.280390 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-fernet-keys\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.280443 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p242m\" (UniqueName: \"kubernetes.io/projected/919c9abf-f844-4aad-a1c1-664173abeb0e-kube-api-access-p242m\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.280565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-combined-ca-bundle\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.382803 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-fernet-keys\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.382898 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p242m\" (UniqueName: \"kubernetes.io/projected/919c9abf-f844-4aad-a1c1-664173abeb0e-kube-api-access-p242m\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.382956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-combined-ca-bundle\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.383014 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-config-data\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.389882 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-fernet-keys\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.391727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-combined-ca-bundle\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.392462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-config-data\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.405535 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p242m\" (UniqueName: \"kubernetes.io/projected/919c9abf-f844-4aad-a1c1-664173abeb0e-kube-api-access-p242m\") pod \"keystone-cron-29415481-bqtsw\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") " pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.517037 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:00 crc kubenswrapper[4795]: I1205 10:01:00.990967 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29415481-bqtsw"]
Dec 05 10:01:01 crc kubenswrapper[4795]: I1205 10:01:01.057509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415481-bqtsw" event={"ID":"919c9abf-f844-4aad-a1c1-664173abeb0e","Type":"ContainerStarted","Data":"f45489c1a8f319329f9ac1707ce6dc942603b9d8883c209f005de86720fdb375"}
Dec 05 10:01:02 crc kubenswrapper[4795]: I1205 10:01:02.069552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415481-bqtsw" event={"ID":"919c9abf-f844-4aad-a1c1-664173abeb0e","Type":"ContainerStarted","Data":"52edc8fc1af83335905f31745a64e62d401003305fb3a5ad9086187682375aac"}
Dec 05 10:01:02 crc kubenswrapper[4795]: I1205 10:01:02.092705 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29415481-bqtsw" podStartSLOduration=2.092676968 podStartE2EDuration="2.092676968s" podCreationTimestamp="2025-12-05 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-05 10:01:02.08542386 +0000 UTC m=+5813.658027599" watchObservedRunningTime="2025-12-05 10:01:02.092676968 +0000 UTC m=+5813.665280707"
Dec 05 10:01:05 crc kubenswrapper[4795]: I1205 10:01:05.098109 4795 generic.go:334] "Generic (PLEG): container finished" podID="919c9abf-f844-4aad-a1c1-664173abeb0e" containerID="52edc8fc1af83335905f31745a64e62d401003305fb3a5ad9086187682375aac" exitCode=0
Dec 05 10:01:05 crc kubenswrapper[4795]: I1205 10:01:05.098207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415481-bqtsw" event={"ID":"919c9abf-f844-4aad-a1c1-664173abeb0e","Type":"ContainerDied","Data":"52edc8fc1af83335905f31745a64e62d401003305fb3a5ad9086187682375aac"}
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.488048 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.619549 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-fernet-keys\") pod \"919c9abf-f844-4aad-a1c1-664173abeb0e\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") "
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.619685 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p242m\" (UniqueName: \"kubernetes.io/projected/919c9abf-f844-4aad-a1c1-664173abeb0e-kube-api-access-p242m\") pod \"919c9abf-f844-4aad-a1c1-664173abeb0e\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") "
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.619707 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-combined-ca-bundle\") pod \"919c9abf-f844-4aad-a1c1-664173abeb0e\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") "
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.619870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-config-data\") pod \"919c9abf-f844-4aad-a1c1-664173abeb0e\" (UID: \"919c9abf-f844-4aad-a1c1-664173abeb0e\") "
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.625480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "919c9abf-f844-4aad-a1c1-664173abeb0e" (UID: "919c9abf-f844-4aad-a1c1-664173abeb0e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.627034 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/919c9abf-f844-4aad-a1c1-664173abeb0e-kube-api-access-p242m" (OuterVolumeSpecName: "kube-api-access-p242m") pod "919c9abf-f844-4aad-a1c1-664173abeb0e" (UID: "919c9abf-f844-4aad-a1c1-664173abeb0e"). InnerVolumeSpecName "kube-api-access-p242m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.650488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "919c9abf-f844-4aad-a1c1-664173abeb0e" (UID: "919c9abf-f844-4aad-a1c1-664173abeb0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.680074 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-config-data" (OuterVolumeSpecName: "config-data") pod "919c9abf-f844-4aad-a1c1-664173abeb0e" (UID: "919c9abf-f844-4aad-a1c1-664173abeb0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.724117 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.724162 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p242m\" (UniqueName: \"kubernetes.io/projected/919c9abf-f844-4aad-a1c1-664173abeb0e-kube-api-access-p242m\") on node \"crc\" DevicePath \"\""
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.724177 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 05 10:01:06 crc kubenswrapper[4795]: I1205 10:01:06.724190 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/919c9abf-f844-4aad-a1c1-664173abeb0e-config-data\") on node \"crc\" DevicePath \"\""
Dec 05 10:01:07 crc kubenswrapper[4795]: I1205 10:01:07.122501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29415481-bqtsw" event={"ID":"919c9abf-f844-4aad-a1c1-664173abeb0e","Type":"ContainerDied","Data":"f45489c1a8f319329f9ac1707ce6dc942603b9d8883c209f005de86720fdb375"}
Dec 05 10:01:07 crc kubenswrapper[4795]: I1205 10:01:07.122560 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f45489c1a8f319329f9ac1707ce6dc942603b9d8883c209f005de86720fdb375"
Dec 05 10:01:07 crc kubenswrapper[4795]: I1205 10:01:07.122674 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29415481-bqtsw"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.487338 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zlmfg"]
Dec 05 10:01:59 crc kubenswrapper[4795]: E1205 10:01:59.488388 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="919c9abf-f844-4aad-a1c1-664173abeb0e" containerName="keystone-cron"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.488405 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="919c9abf-f844-4aad-a1c1-664173abeb0e" containerName="keystone-cron"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.488781 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="919c9abf-f844-4aad-a1c1-664173abeb0e" containerName="keystone-cron"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.490218 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.512055 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zlmfg"]
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.679702 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-utilities\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.679794 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5k4k\" (UniqueName: \"kubernetes.io/projected/e907abd3-8347-429d-81e9-ac495b51dedd-kube-api-access-j5k4k\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.679826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-catalog-content\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.781495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-utilities\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.781603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5k4k\" (UniqueName: \"kubernetes.io/projected/e907abd3-8347-429d-81e9-ac495b51dedd-kube-api-access-j5k4k\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.781658 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-catalog-content\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.782403 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-catalog-content\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.782669 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-utilities\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.806799 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5k4k\" (UniqueName: \"kubernetes.io/projected/e907abd3-8347-429d-81e9-ac495b51dedd-kube-api-access-j5k4k\") pod \"community-operators-zlmfg\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") " pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:01:59 crc kubenswrapper[4795]: I1205 10:01:59.874428 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:02:00 crc kubenswrapper[4795]: I1205 10:02:00.510078 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zlmfg"]
Dec 05 10:02:00 crc kubenswrapper[4795]: I1205 10:02:00.689052 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlmfg" event={"ID":"e907abd3-8347-429d-81e9-ac495b51dedd","Type":"ContainerStarted","Data":"4bc4905acf663cad8ea78903989690d54cb42ca8e97d50bca87f1299b944eb18"}
Dec 05 10:02:01 crc kubenswrapper[4795]: I1205 10:02:01.702170 4795 generic.go:334] "Generic (PLEG): container finished" podID="e907abd3-8347-429d-81e9-ac495b51dedd" containerID="e84498aa00e1e83d6e582ec1ef309c8f024534118af2896a5723fdfaa1fd32cb" exitCode=0
Dec 05 10:02:01 crc kubenswrapper[4795]: I1205 10:02:01.702279 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlmfg" event={"ID":"e907abd3-8347-429d-81e9-ac495b51dedd","Type":"ContainerDied","Data":"e84498aa00e1e83d6e582ec1ef309c8f024534118af2896a5723fdfaa1fd32cb"}
Dec 05 10:02:03 crc kubenswrapper[4795]: I1205 10:02:03.722508 4795 generic.go:334] "Generic (PLEG): container finished" podID="e907abd3-8347-429d-81e9-ac495b51dedd" containerID="bc948d0cdf88deabca25c3a9f2ced3b753c51cb3b8f643526abc539a1a820b73" exitCode=0
Dec 05 10:02:03 crc kubenswrapper[4795]: I1205 10:02:03.722650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlmfg" event={"ID":"e907abd3-8347-429d-81e9-ac495b51dedd","Type":"ContainerDied","Data":"bc948d0cdf88deabca25c3a9f2ced3b753c51cb3b8f643526abc539a1a820b73"}
Dec 05 10:02:04 crc kubenswrapper[4795]: I1205 10:02:04.741096 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlmfg" event={"ID":"e907abd3-8347-429d-81e9-ac495b51dedd","Type":"ContainerStarted","Data":"5ac6c6a85efeacf29ca777c6fce5edc2948978a66aa880aa408dda559a9b0ea6"}
Dec 05 10:02:04 crc kubenswrapper[4795]: I1205 10:02:04.771228 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zlmfg" podStartSLOduration=3.184289605 podStartE2EDuration="5.771207103s" podCreationTimestamp="2025-12-05 10:01:59 +0000 UTC" firstStartedPulling="2025-12-05 10:02:01.706137522 +0000 UTC m=+5873.278741261" lastFinishedPulling="2025-12-05 10:02:04.29305502 +0000 UTC m=+5875.865658759" observedRunningTime="2025-12-05 10:02:04.765153308 +0000 UTC m=+5876.337757047" watchObservedRunningTime="2025-12-05 10:02:04.771207103 +0000 UTC m=+5876.343810842"
Dec 05 10:02:09 crc kubenswrapper[4795]: I1205 10:02:09.874811 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:02:09 crc kubenswrapper[4795]: I1205 10:02:09.876325 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:02:09 crc kubenswrapper[4795]: I1205 10:02:09.934124 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:02:10 crc kubenswrapper[4795]: I1205 10:02:10.863313 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:02:10 crc kubenswrapper[4795]: I1205 10:02:10.929181 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zlmfg"]
Dec 05 10:02:12 crc kubenswrapper[4795]: I1205 10:02:12.827792 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zlmfg" podUID="e907abd3-8347-429d-81e9-ac495b51dedd" containerName="registry-server" containerID="cri-o://5ac6c6a85efeacf29ca777c6fce5edc2948978a66aa880aa408dda559a9b0ea6" gracePeriod=2
Dec 05 10:02:15 crc kubenswrapper[4795]: I1205 10:02:15.864001 4795 generic.go:334] "Generic (PLEG): container finished" podID="e907abd3-8347-429d-81e9-ac495b51dedd" containerID="5ac6c6a85efeacf29ca777c6fce5edc2948978a66aa880aa408dda559a9b0ea6" exitCode=0
Dec 05 10:02:15 crc kubenswrapper[4795]: I1205 10:02:15.864324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlmfg" event={"ID":"e907abd3-8347-429d-81e9-ac495b51dedd","Type":"ContainerDied","Data":"5ac6c6a85efeacf29ca777c6fce5edc2948978a66aa880aa408dda559a9b0ea6"}
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.041131 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.172321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-utilities\") pod \"e907abd3-8347-429d-81e9-ac495b51dedd\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") "
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.172662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-catalog-content\") pod \"e907abd3-8347-429d-81e9-ac495b51dedd\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") "
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.172713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5k4k\" (UniqueName: \"kubernetes.io/projected/e907abd3-8347-429d-81e9-ac495b51dedd-kube-api-access-j5k4k\") pod \"e907abd3-8347-429d-81e9-ac495b51dedd\" (UID: \"e907abd3-8347-429d-81e9-ac495b51dedd\") "
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.175770 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-utilities" (OuterVolumeSpecName: "utilities") pod "e907abd3-8347-429d-81e9-ac495b51dedd" (UID: "e907abd3-8347-429d-81e9-ac495b51dedd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.191411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e907abd3-8347-429d-81e9-ac495b51dedd-kube-api-access-j5k4k" (OuterVolumeSpecName: "kube-api-access-j5k4k") pod "e907abd3-8347-429d-81e9-ac495b51dedd" (UID: "e907abd3-8347-429d-81e9-ac495b51dedd"). InnerVolumeSpecName "kube-api-access-j5k4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.224993 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e907abd3-8347-429d-81e9-ac495b51dedd" (UID: "e907abd3-8347-429d-81e9-ac495b51dedd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.274516 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.274550 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5k4k\" (UniqueName: \"kubernetes.io/projected/e907abd3-8347-429d-81e9-ac495b51dedd-kube-api-access-j5k4k\") on node \"crc\" DevicePath \"\""
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.274560 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e907abd3-8347-429d-81e9-ac495b51dedd-utilities\") on node \"crc\" DevicePath \"\""
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.878457 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlmfg" event={"ID":"e907abd3-8347-429d-81e9-ac495b51dedd","Type":"ContainerDied","Data":"4bc4905acf663cad8ea78903989690d54cb42ca8e97d50bca87f1299b944eb18"}
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.878528 4795 scope.go:117] "RemoveContainer" containerID="5ac6c6a85efeacf29ca777c6fce5edc2948978a66aa880aa408dda559a9b0ea6"
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.879731 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlmfg"
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.916217 4795 scope.go:117] "RemoveContainer" containerID="bc948d0cdf88deabca25c3a9f2ced3b753c51cb3b8f643526abc539a1a820b73"
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.917115 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zlmfg"]
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.928018 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zlmfg"]
Dec 05 10:02:16 crc kubenswrapper[4795]: I1205 10:02:16.951987 4795 scope.go:117] "RemoveContainer" containerID="e84498aa00e1e83d6e582ec1ef309c8f024534118af2896a5723fdfaa1fd32cb"
Dec 05 10:02:18 crc kubenswrapper[4795]: I1205 10:02:18.758951 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e907abd3-8347-429d-81e9-ac495b51dedd" path="/var/lib/kubelet/pods/e907abd3-8347-429d-81e9-ac495b51dedd/volumes"